Mar 07 04:19:13 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 07 04:19:13 crc restorecon[4688]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 04:19:13 crc restorecon[4688]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 07 04:19:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc 
restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc 
restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 
04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 04:19:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 
crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 
04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 04:19:14 crc 
restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc 
restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc 
restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 04:19:14 crc restorecon[4688]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc 
restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 04:19:14 crc restorecon[4688]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 04:19:14 crc restorecon[4688]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 07 04:19:15 crc kubenswrapper[4689]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 04:19:15 crc kubenswrapper[4689]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 07 04:19:15 crc kubenswrapper[4689]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 04:19:15 crc kubenswrapper[4689]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 07 04:19:15 crc kubenswrapper[4689]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 07 04:19:15 crc kubenswrapper[4689]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.536281 4689 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544385 4689 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544420 4689 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544434 4689 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544446 4689 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544457 4689 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544467 4689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544476 4689 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544484 4689 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544493 4689 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544501 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544510 4689 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544518 4689 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544527 4689 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544548 4689 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544557 4689 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544565 4689 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544573 4689 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544581 4689 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544592 4689 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544601 4689 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544608 4689 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544616 4689 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544624 4689 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544632 4689 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544639 4689 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544647 4689 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544655 4689 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544662 4689 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544670 4689 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544678 4689 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544686 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544695 4689 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544704 4689 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544714 4689 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544723 4689 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544732 4689 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544741 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544749 4689 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544757 4689 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544765 4689 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544776 4689 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544786 4689 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544795 4689 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544804 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544812 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544821 4689 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544829 4689 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544839 4689 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544848 4689 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544857 4689 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544866 4689 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544875 4689 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544883 4689 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544891 4689 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544900 4689 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544908 4689 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544916 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544924 4689 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544932 4689 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544944 4689 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544953 4689 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544962 4689 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544982 4689 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544990 4689 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.544999 4689 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.545007 4689 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.545015 4689 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.545023 4689 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.545030 4689 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.545038 4689 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.545046 4689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546234 4689 flags.go:64] FLAG: --address="0.0.0.0"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546260 4689 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546277 4689 flags.go:64] FLAG: --anonymous-auth="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546289 4689 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546301 4689 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546310 4689 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546322 4689 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546334 4689 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546343 4689 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546354 4689 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546363 4689 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546372 4689 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546382 4689 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546391 4689 flags.go:64] FLAG: --cgroup-root=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546400 4689 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546409 4689 flags.go:64] FLAG: --client-ca-file=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546417 4689 flags.go:64] FLAG: --cloud-config=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546426 4689 flags.go:64] FLAG: --cloud-provider=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546439 4689 flags.go:64] FLAG: --cluster-dns="[]"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546465 4689 flags.go:64] FLAG: --cluster-domain=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546473 4689 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546482 4689 flags.go:64] FLAG: --config-dir=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546491 4689 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546501 4689 flags.go:64] FLAG: --container-log-max-files="5"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546513 4689 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546522 4689 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546531 4689 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546547 4689 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546557 4689 flags.go:64] FLAG: --contention-profiling="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546566 4689 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546575 4689 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546585 4689 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546593 4689 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546605 4689 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546614 4689 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546623 4689 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546631 4689 flags.go:64] FLAG: --enable-load-reader="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546640 4689 flags.go:64] FLAG: --enable-server="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546649 4689 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546664 4689 flags.go:64] FLAG: --event-burst="100"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546674 4689 flags.go:64] FLAG: --event-qps="50"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546683 4689 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546692 4689 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546701 4689 flags.go:64] FLAG: --eviction-hard=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546711 4689 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546721 4689 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546730 4689 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546739 4689 flags.go:64] FLAG: --eviction-soft=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546748 4689 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546757 4689 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546766 4689 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546776 4689 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546784 4689 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546793 4689 flags.go:64] FLAG: --fail-swap-on="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546802 4689 flags.go:64] FLAG: --feature-gates=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546813 4689 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546822 4689 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546831 4689 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546841 4689 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546850 4689 flags.go:64] FLAG: --healthz-port="10248"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546859 4689 flags.go:64] FLAG: --help="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546868 4689 flags.go:64] FLAG: --hostname-override=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546876 4689 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546887 4689 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546896 4689 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546906 4689 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546915 4689 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546924 4689 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546932 4689 flags.go:64] FLAG: --image-service-endpoint=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546941 4689 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546950 4689 flags.go:64] FLAG: --kube-api-burst="100"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546959 4689 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546969 4689 flags.go:64] FLAG: --kube-api-qps="50"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546978 4689 flags.go:64] FLAG: --kube-reserved=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546987 4689 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.546995 4689 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547004 4689 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547013 4689 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547022 4689 flags.go:64] FLAG: --lock-file=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547031 4689 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547040 4689 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547049 4689 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547064 4689 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547075 4689 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547084 4689 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547092 4689 flags.go:64] FLAG: --logging-format="text"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547101 4689 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547111 4689 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547120 4689 flags.go:64] FLAG: --manifest-url=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547129 4689 flags.go:64] FLAG: --manifest-url-header=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547140 4689 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547149 4689 flags.go:64] FLAG: --max-open-files="1000000"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547160 4689 flags.go:64] FLAG: --max-pods="110"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547196 4689 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547206 4689 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547214 4689 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547223 4689 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547232 4689 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547242 4689 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547253 4689 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547275 4689 flags.go:64] FLAG: --node-status-max-images="50"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547284 4689 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547293 4689 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547303 4689 flags.go:64] FLAG: --pod-cidr=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547312 4689 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547326 4689 flags.go:64] FLAG: --pod-manifest-path=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547335 4689 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547344 4689 flags.go:64] FLAG: --pods-per-core="0"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547354 4689 flags.go:64] FLAG: --port="10250"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547364 4689 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547374 4689 flags.go:64] FLAG: --provider-id=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547384 4689 flags.go:64] FLAG: --qos-reserved=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547393 4689 flags.go:64] FLAG: --read-only-port="10255"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547403 4689 flags.go:64] FLAG: --register-node="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547412 4689 flags.go:64] FLAG: --register-schedulable="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547421 4689 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547436 4689 flags.go:64] FLAG: --registry-burst="10"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547445 4689 flags.go:64] FLAG: --registry-qps="5"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547454 4689 flags.go:64] FLAG: --reserved-cpus=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547463 4689 flags.go:64] FLAG: --reserved-memory=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547474 4689 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547484 4689 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547493 4689 flags.go:64] FLAG: --rotate-certificates="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547502 4689 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547511 4689 flags.go:64] FLAG: --runonce="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547520 4689 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547529 4689 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547538 4689 flags.go:64] FLAG: --seccomp-default="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547548 4689 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547557 4689 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547566 4689 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547575 4689 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547584 4689 flags.go:64] FLAG: --storage-driver-password="root"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547593 4689 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547603 4689 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547613 4689 flags.go:64] FLAG: --storage-driver-user="root"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547622 4689 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547631 4689 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547640 4689 flags.go:64] FLAG: --system-cgroups=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547649 4689 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547662 4689 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547671 4689 flags.go:64] FLAG: --tls-cert-file=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547680 4689 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547690 4689 flags.go:64] FLAG: --tls-min-version=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547701 4689 flags.go:64] FLAG: --tls-private-key-file=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547709 4689 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547719 4689 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547728 4689 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547737 4689 flags.go:64] FLAG: --v="2"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547750 4689 flags.go:64] FLAG: --version="false"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547761 4689 flags.go:64] FLAG: --vmodule=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547772 4689 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.547782 4689 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548024 4689 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548036 4689 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548045 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548054 4689 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548063 4689 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548072 4689 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548080 4689 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548088 4689 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548096 4689 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548105 4689 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548115 4689 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548126 4689 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548136 4689 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548145 4689 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548154 4689 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548162 4689 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548198 4689 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548206 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548218 4689 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548228 4689 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548236 4689 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548245 4689 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548254 4689 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548268 4689 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548276 4689 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548285 4689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548293 4689 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548301 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548310 4689 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548318 4689 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548326 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548334 4689 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548342 4689 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548351 4689 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548360 4689 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548368 4689 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548377 4689 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548386 4689 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548396 4689 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548405 4689 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548414 4689 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548423 4689 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548432 4689 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548441 4689 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548451 4689 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548461 4689 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548472 4689 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548481 4689 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548491 4689 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548501 4689 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548509 4689 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548518 4689 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548526 4689 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548534 4689 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548552 4689 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548563 4689 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548571 4689 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548578 4689 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548586 4689 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548594 4689 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548602 4689 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548609 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548617 4689 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548625 4689 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548633 4689 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548640 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548648 4689 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548655 4689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548663 4689 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548671 4689 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.548678 4689 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.548701 4689 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.565275 4689 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.565332 4689 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565480 4689 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565495 4689 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565505 4689 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565515 4689 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565523 4689 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565532 4689 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565545 4689 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565556 4689 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565566 4689 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565577 4689 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565590 4689 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565600 4689 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565609 4689 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565617 4689 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565626 4689 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565633 4689 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565641 4689 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565649 4689 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565657 4689 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565665 4689 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565673 4689 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565681 4689 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565689 4689 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565697 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565705 4689 
feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565713 4689 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565720 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565728 4689 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565736 4689 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565744 4689 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565752 4689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565761 4689 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565774 4689 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565784 4689 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565793 4689 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565802 4689 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565811 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565820 4689 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565828 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565839 4689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565850 4689 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565860 4689 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565870 4689 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565879 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565888 4689 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565916 4689 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565925 4689 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 
04:19:15.565933 4689 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565941 4689 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565950 4689 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565958 4689 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565967 4689 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565976 4689 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565984 4689 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565991 4689 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.565999 4689 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566008 4689 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566016 4689 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566028 4689 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566037 4689 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566046 4689 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566055 4689 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566063 4689 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566071 4689 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566079 4689 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566086 4689 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566094 4689 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566102 4689 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566110 4689 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566117 4689 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566149 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.566164 4689 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false 
ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566435 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566448 4689 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566457 4689 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566468 4689 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566476 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566485 4689 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566493 4689 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566501 4689 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566509 4689 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566517 4689 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566524 4689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566533 4689 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566541 4689 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 07 
04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566549 4689 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566557 4689 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566567 4689 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566578 4689 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566588 4689 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566596 4689 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566606 4689 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566614 4689 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566623 4689 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566631 4689 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566640 4689 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566649 4689 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566659 4689 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566668 4689 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566676 4689 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566684 4689 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566692 4689 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566701 4689 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566708 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566716 4689 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566724 4689 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566732 4689 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566741 4689 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566751 4689 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566760 4689 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566770 4689 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566779 4689 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566788 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566796 4689 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566804 4689 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566812 4689 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566821 4689 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566829 4689 feature_gate.go:330] unrecognized feature gate: Example Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566837 4689 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566846 4689 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566854 4689 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566862 4689 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566872 4689 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566883 4689 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566892 4689 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566899 4689 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566907 4689 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566915 4689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566923 4689 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566931 4689 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566939 4689 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566947 4689 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566955 4689 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566963 4689 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566970 4689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566978 4689 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566985 4689 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.566993 4689 feature_gate.go:330] 
unrecognized feature gate: BuildCSIVolumes Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.567001 4689 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.567010 4689 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.567018 4689 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.567026 4689 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.567033 4689 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.567046 4689 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.568141 4689 server.go:940] "Client rotation is on, will bootstrap in background" Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.572576 4689 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.577703 4689 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.577852 4689 certificate_store.go:130] Loading cert/key pair from 
"/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.582016 4689 server.go:997] "Starting client certificate rotation" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.582097 4689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.583090 4689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.617767 4689 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.620088 4689 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.621189 4689 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.640494 4689 log.go:25] "Validated CRI v1 runtime API" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.675729 4689 log.go:25] "Validated CRI v1 image API" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.680920 4689 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.686886 4689 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-07-04-15-26-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 07 04:19:15 crc 
kubenswrapper[4689]: I0307 04:19:15.686944 4689 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.712668 4689 manager.go:217] Machine: {Timestamp:2026-03-07 04:19:15.708835909 +0000 UTC m=+0.755219418 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6d63441d-81be-4ce6-837e-b2f91e86c31f BootID:0f61933f-c340-4249-a24a-1d8f57f94460 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 
Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:5a:67:62 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:5a:67:62 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:3b:4c:17 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a0:9a:28 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:3d:a7:10 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:9e:2e:f2 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:de:7b:99:8e:3f:e3 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:5f:16:22:b9:bf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.712956 4689 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.713247 4689 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.714786 4689 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.715005 4689 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.715046 4689 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.715339 4689 topology_manager.go:138] "Creating topology manager with none policy"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.715351 4689 container_manager_linux.go:303] "Creating device plugin manager"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.715938 4689 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.715970 4689 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.719111 4689 state_mem.go:36] "Initialized new in-memory state store"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.719239 4689 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.724890 4689 kubelet.go:418] "Attempting to sync node with API server"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.724927 4689 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.724977 4689 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.724997 4689 kubelet.go:324] "Adding apiserver pod source"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.725017 4689 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.730012 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.730334 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.730401 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.730528 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.730836 4689 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.731796 4689 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.733428 4689 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736754 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736780 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736787 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736795 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736805 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736813 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736820 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736832 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736840 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736849 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736882 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.736891 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.737869 4689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.738504 4689 server.go:1280] "Started kubelet"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.738790 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.739524 4689 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.739598 4689 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.740435 4689 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 07 04:19:15 crc systemd[1]: Started Kubernetes Kubelet.
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.741781 4689 server.go:460] "Adding debug handlers to kubelet server"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.742902 4689 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.742971 4689 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.743591 4689 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.743622 4689 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.743766 4689 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.743857 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.744297 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.744448 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.756429 4689 factory.go:55] Registering systemd factory
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.756489 4689 factory.go:221] Registration of the systemd container factory successfully
Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.757291 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.761255 4689 factory.go:153] Registering CRI-O factory
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.761309 4689 factory.go:221] Registration of the crio container factory successfully
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.761606 4689 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.761659 4689 factory.go:103] Registering Raw factory
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.761687 4689 manager.go:1196] Started watching for new ooms in manager
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.763362 4689 manager.go:319] Starting recovery of all containers
Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.758086 4689 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189a7430ad4d00fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.738468605 +0000 UTC m=+0.784852104,LastTimestamp:2026-03-07 04:19:15.738468605 +0000 UTC m=+0.784852104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774386 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774466 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774490 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774511 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774532 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774553 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774574 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774596 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774620 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774644 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774663 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774683 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774705 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774733 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774756 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774774 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774803 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774822 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774843 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774863 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774896 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774920 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.774945 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775011 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775035 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775081 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775107 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775167 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775225 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775247 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775266 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775288 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775311 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775334 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775354 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775376 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775395 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775417 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775436 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775454 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775474 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775499 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775520 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775540 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775560 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775580 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775600 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775620 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775639 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775660 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775679 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775697 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775759 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775785 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775808 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775831 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775852 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775875 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775897 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775916 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775937 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775957 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.775982 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776001 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776021 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776040 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776059 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776080 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776097 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776117 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776135 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776152 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776198 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776220 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776241 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776260 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776282 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776299 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776320 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776339 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776361 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776381 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776399 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776416 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776437 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.776457 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.778681 4689 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.778741 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.778764 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.778792 4689 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.778824 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.778849 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.778880 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.778913 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.778938 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.778961 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.778983 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779003 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779025 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779046 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779067 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779089 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779111 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779132 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779151 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779214 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779246 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779292 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" 
seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779319 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779340 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779368 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779393 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779419 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779443 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 07 04:19:15 crc 
kubenswrapper[4689]: I0307 04:19:15.779470 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779498 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779519 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779541 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779568 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779590 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779611 4689 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779666 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779688 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779708 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779731 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779752 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779772 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779790 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779812 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779833 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779852 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779873 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779895 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779916 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779936 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779955 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779974 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.779994 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780016 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780037 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780059 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780079 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780099 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780118 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780139 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780158 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780207 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780226 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780245 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780266 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780287 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780306 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780327 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780347 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780376 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780403 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780433 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780457 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780477 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780509 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780529 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780548 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780570 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780590 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780610 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780628 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780648 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780667 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780689 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780711 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780736 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780759 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780782 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780802 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780824 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780842 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780861 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780881 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780901 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780920 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780943 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780964 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.780982 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781001 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781020 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781039 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781057 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781075 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781095 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781115 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781133 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781153 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781234 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" 
seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781258 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781280 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781300 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781320 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781338 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781360 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 
04:19:15.781381 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781431 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781452 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781471 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781493 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781515 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781536 4689 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781571 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781595 4689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781613 4689 reconstruct.go:97] "Volume reconstruction finished" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.781627 4689 reconciler.go:26] "Reconciler: start to sync state" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.797619 4689 manager.go:324] Recovery completed Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.815680 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.818385 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.818460 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.818481 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.819569 4689 cpu_manager.go:225] "Starting CPU manager" 
policy="none" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.819601 4689 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.819643 4689 state_mem.go:36] "Initialized new in-memory state store" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.820944 4689 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.824508 4689 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.824557 4689 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.824598 4689 kubelet.go:2335] "Starting kubelet main sync loop" Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.824789 4689 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 07 04:19:15 crc kubenswrapper[4689]: W0307 04:19:15.827159 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.827279 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.834041 4689 policy_none.go:49] "None policy: Start" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.835395 4689 memory_manager.go:170] "Starting memorymanager" 
policy="None" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.835448 4689 state_mem.go:35] "Initializing new in-memory state store" Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.844714 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.897661 4689 manager.go:334] "Starting Device Plugin manager" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.898014 4689 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.898045 4689 server.go:79] "Starting device plugin registration server" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.898677 4689 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.898704 4689 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.898894 4689 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.899068 4689 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.899092 4689 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.907846 4689 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.925078 4689 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.925266 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.926512 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.926597 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.926628 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.927007 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.927381 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.927482 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.928757 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.928823 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.928837 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.929003 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.929040 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.929047 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.929055 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.929283 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.929351 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.930200 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.930290 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.930344 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.930536 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.930559 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.930563 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.930680 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.930630 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.930754 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.931929 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.931979 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.932002 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.932062 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.932083 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.932099 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.932215 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.932370 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.932434 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.933255 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.933286 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.933301 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.933301 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.933457 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.933483 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.933770 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.933828 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.935672 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.935771 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.935800 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:15 crc kubenswrapper[4689]: E0307 04:19:15.958292 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.984621 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.984696 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.984725 4689 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.984756 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.984786 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.984846 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.984906 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.984986 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.985031 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.985063 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.985101 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.985130 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.985152 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.985206 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.985238 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 04:19:15 crc kubenswrapper[4689]: I0307 04:19:15.999083 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.000309 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.000364 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.000384 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.000423 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:19:16 crc kubenswrapper[4689]: E0307 04:19:16.000959 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Mar 07 
04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086459 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086516 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086546 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086575 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086599 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086623 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086636 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086732 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086728 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086658 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086780 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086782 
4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086848 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086904 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086857 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086958 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.086990 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087022 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087051 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087079 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087112 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087098 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087144 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087187 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087219 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087211 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087271 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087143 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087342 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.087401 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.201291 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.203001 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.203055 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.203073 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.203109 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:19:16 crc kubenswrapper[4689]: E0307 04:19:16.203586 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.259797 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.266221 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.281533 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.299892 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.307275 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 04:19:16 crc kubenswrapper[4689]: E0307 04:19:16.360292 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Mar 07 04:19:16 crc kubenswrapper[4689]: W0307 04:19:16.360660 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-172d3c5c427c2dab8d8c0d49bc555caa51cd8f1b2adb0cbad341efeeb7e4fad5 WatchSource:0}: Error finding container 172d3c5c427c2dab8d8c0d49bc555caa51cd8f1b2adb0cbad341efeeb7e4fad5: Status 404 returned error can't find the container with id 172d3c5c427c2dab8d8c0d49bc555caa51cd8f1b2adb0cbad341efeeb7e4fad5 Mar 07 04:19:16 crc kubenswrapper[4689]: 
W0307 04:19:16.364524 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-d1b50ae76df31fa273eeef9a54f0c0ac80c7f6c50987de05a5fbecdc0c2b1c2b WatchSource:0}: Error finding container d1b50ae76df31fa273eeef9a54f0c0ac80c7f6c50987de05a5fbecdc0c2b1c2b: Status 404 returned error can't find the container with id d1b50ae76df31fa273eeef9a54f0c0ac80c7f6c50987de05a5fbecdc0c2b1c2b Mar 07 04:19:16 crc kubenswrapper[4689]: W0307 04:19:16.366076 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-18c835de6763922031808271af5dc828678257f6ff0dd687ed878cdc94b9437b WatchSource:0}: Error finding container 18c835de6763922031808271af5dc828678257f6ff0dd687ed878cdc94b9437b: Status 404 returned error can't find the container with id 18c835de6763922031808271af5dc828678257f6ff0dd687ed878cdc94b9437b Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.604386 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.606504 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.606563 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.606576 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.606610 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:19:16 crc kubenswrapper[4689]: E0307 04:19:16.607277 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.739812 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 07 04:19:16 crc kubenswrapper[4689]: W0307 04:19:16.795641 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 07 04:19:16 crc kubenswrapper[4689]: E0307 04:19:16.795733 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.831521 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"332477a5d4abb6bdd3fe60acecca2f940477809c7adf146ecb0acd22ffd05134"} Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.833231 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"858b9f244a112f94ce768dc8078f59506de2e8d139c79017f8c584fd471128ae"} Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.834357 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"18c835de6763922031808271af5dc828678257f6ff0dd687ed878cdc94b9437b"} Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.835624 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d1b50ae76df31fa273eeef9a54f0c0ac80c7f6c50987de05a5fbecdc0c2b1c2b"} Mar 07 04:19:16 crc kubenswrapper[4689]: I0307 04:19:16.836788 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"172d3c5c427c2dab8d8c0d49bc555caa51cd8f1b2adb0cbad341efeeb7e4fad5"} Mar 07 04:19:16 crc kubenswrapper[4689]: W0307 04:19:16.842397 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 07 04:19:16 crc kubenswrapper[4689]: E0307 04:19:16.842477 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 07 04:19:17 crc kubenswrapper[4689]: W0307 04:19:17.058081 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 07 04:19:17 crc kubenswrapper[4689]: E0307 04:19:17.058212 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 07 04:19:17 crc kubenswrapper[4689]: E0307 04:19:17.161898 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Mar 07 04:19:17 crc kubenswrapper[4689]: W0307 04:19:17.198057 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 07 04:19:17 crc kubenswrapper[4689]: E0307 04:19:17.198150 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.407923 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.409581 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.409635 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.409648 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:17 crc 
kubenswrapper[4689]: I0307 04:19:17.409680 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:19:17 crc kubenswrapper[4689]: E0307 04:19:17.410386 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.648022 4689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 04:19:17 crc kubenswrapper[4689]: E0307 04:19:17.649805 4689 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.740712 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.844151 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24"} Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.844267 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765"} Mar 07 04:19:17 crc 
kubenswrapper[4689]: I0307 04:19:17.846404 4689 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6" exitCode=0 Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.846855 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.846835 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6"} Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.848906 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.848980 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.849005 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.849027 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a219e6def5b606366506c7fcece2fd17dbe1f595882747c63c3dd3ef2fd35215"} Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.849208 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.848997 4689 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a219e6def5b606366506c7fcece2fd17dbe1f595882747c63c3dd3ef2fd35215" exitCode=0 Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.850674 
4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.850705 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.850719 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.852040 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.852265 4689 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61" exitCode=0 Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.852353 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61"} Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.852355 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.853823 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.853958 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.854087 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.854233 4689 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.854132 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.854445 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.855388 4689 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f" exitCode=0 Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.855445 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f"} Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.855527 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.857299 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.857472 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:17 crc kubenswrapper[4689]: I0307 04:19:17.857528 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.739482 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 07 04:19:18 crc 
kubenswrapper[4689]: E0307 04:19:18.763314 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.863308 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10"} Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.863361 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27"} Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.863373 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643"} Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.864625 4689 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4ca75cc4882b20888542f5314b05296ff463431c7dd9ce2fbdd43457ce977f24" exitCode=0 Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.864701 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4ca75cc4882b20888542f5314b05296ff463431c7dd9ce2fbdd43457ce977f24"} Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.864717 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:18 crc 
kubenswrapper[4689]: I0307 04:19:18.868736 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.868770 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.868781 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.871001 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"742bbed66a04e7b4f7db672f2ed05a32c1592384661edebf72f01e9b1b7d0eb7"} Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.871043 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.872134 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.872236 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.872256 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.873003 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e3dcba7bfbb1a5097afa4c8643d0fdc845439b5107877e8689daed2072d34e5f"} Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.873056 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c2145120e262bbc6fd4876167d2bc0bd4f23ca467a1ab81f57e8df919c721c18"} Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.873068 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e2c10b5babd421667e41329cf7d752810507842c003e9b0e24c07c59e3e866b3"} Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.874987 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d"} Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.875019 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96"} Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.875044 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.875765 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.875808 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:18 crc kubenswrapper[4689]: I0307 04:19:18.875822 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.011223 4689 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.030923 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.030984 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.030996 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.031024 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:19:19 crc kubenswrapper[4689]: E0307 04:19:19.031532 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Mar 07 04:19:19 crc kubenswrapper[4689]: W0307 04:19:19.617369 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 07 04:19:19 crc kubenswrapper[4689]: E0307 04:19:19.617469 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.740415 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection 
refused Mar 07 04:19:19 crc kubenswrapper[4689]: W0307 04:19:19.763985 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Mar 07 04:19:19 crc kubenswrapper[4689]: E0307 04:19:19.764056 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.880801 4689 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e21a89292ef00349d6c3d46d78c87a560415567582e50fd0468a9d74b26fb890" exitCode=0 Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.880871 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e21a89292ef00349d6c3d46d78c87a560415567582e50fd0468a9d74b26fb890"} Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.881005 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.881848 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.881876 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.881887 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 
04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.890432 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.890925 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.891241 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ce179675be9316692288db78175210e195bfe65d783a6c110a13d28cd550ee49"} Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.891275 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555"} Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.891329 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.891699 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.892585 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.892607 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.892617 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.893146 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:19 crc 
kubenswrapper[4689]: I0307 04:19:19.893195 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.893207 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.893692 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.893715 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.893726 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.894209 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.894243 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:19 crc kubenswrapper[4689]: I0307 04:19:19.894252 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:20 crc kubenswrapper[4689]: I0307 04:19:20.282245 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:20 crc kubenswrapper[4689]: I0307 04:19:20.900558 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f5d217372647b44fc920c094888386c68a97b7110fdd444dfc9567985528032d"} Mar 07 04:19:20 crc kubenswrapper[4689]: I0307 04:19:20.900618 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3cf518a0904baf051a85f038f76ab2b10401243edf2984023766b0359501abfd"} Mar 07 04:19:20 crc kubenswrapper[4689]: I0307 04:19:20.900638 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"53b41aef19b25a43b7910cb036ad3a04aa1f6105aa23aae4770c12e0df7b63d3"} Mar 07 04:19:20 crc kubenswrapper[4689]: I0307 04:19:20.900653 4689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 04:19:20 crc kubenswrapper[4689]: I0307 04:19:20.900715 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:20 crc kubenswrapper[4689]: I0307 04:19:20.902343 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:20 crc kubenswrapper[4689]: I0307 04:19:20.902379 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:20 crc kubenswrapper[4689]: I0307 04:19:20.902393 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.399387 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.718942 4689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.909988 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f7dcc862bbeb8f9baddde2c48c7d7a63742df61bd8b2eb5d40ff90bfd2ee6733"} Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.910077 4689 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"19433c68acee8873af862c6e4f1c5b8486472e4702dad35a941fc00bb9872252"} Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.910095 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.910207 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.911476 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.911539 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.911567 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.911589 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.911625 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:21 crc kubenswrapper[4689]: I0307 04:19:21.911649 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.006568 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.006829 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.011757 
4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.011855 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.011871 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.165512 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.232117 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.233709 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.233801 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.233823 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.233881 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.913238 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.913342 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.913384 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:22 crc 
kubenswrapper[4689]: I0307 04:19:22.915264 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.915305 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.915318 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.915411 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.915442 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.915498 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.915724 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.915830 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:22 crc kubenswrapper[4689]: I0307 04:19:22.915847 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:23 crc kubenswrapper[4689]: I0307 04:19:23.103455 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:23 crc kubenswrapper[4689]: I0307 04:19:23.749693 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:23 crc kubenswrapper[4689]: I0307 04:19:23.756345 4689 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:23 crc kubenswrapper[4689]: I0307 04:19:23.916885 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:23 crc kubenswrapper[4689]: I0307 04:19:23.916922 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:23 crc kubenswrapper[4689]: I0307 04:19:23.918346 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:23 crc kubenswrapper[4689]: I0307 04:19:23.918351 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:23 crc kubenswrapper[4689]: I0307 04:19:23.918399 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:23 crc kubenswrapper[4689]: I0307 04:19:23.918427 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:23 crc kubenswrapper[4689]: I0307 04:19:23.918444 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:23 crc kubenswrapper[4689]: I0307 04:19:23.918452 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:24 crc kubenswrapper[4689]: I0307 04:19:24.497436 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 07 04:19:24 crc kubenswrapper[4689]: I0307 04:19:24.497727 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:24 crc kubenswrapper[4689]: I0307 04:19:24.499644 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 07 04:19:24 crc kubenswrapper[4689]: I0307 04:19:24.499719 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:24 crc kubenswrapper[4689]: I0307 04:19:24.499740 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:24 crc kubenswrapper[4689]: I0307 04:19:24.919822 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:24 crc kubenswrapper[4689]: I0307 04:19:24.921112 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:24 crc kubenswrapper[4689]: I0307 04:19:24.921239 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:24 crc kubenswrapper[4689]: I0307 04:19:24.921260 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:25 crc kubenswrapper[4689]: I0307 04:19:25.115885 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 04:19:25 crc kubenswrapper[4689]: I0307 04:19:25.116294 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:25 crc kubenswrapper[4689]: I0307 04:19:25.118001 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:25 crc kubenswrapper[4689]: I0307 04:19:25.118065 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:25 crc kubenswrapper[4689]: I0307 04:19:25.118083 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:25 crc kubenswrapper[4689]: I0307 04:19:25.519827 4689 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 07 04:19:25 crc kubenswrapper[4689]: I0307 04:19:25.520156 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:25 crc kubenswrapper[4689]: I0307 04:19:25.521939 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:25 crc kubenswrapper[4689]: I0307 04:19:25.522029 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:25 crc kubenswrapper[4689]: I0307 04:19:25.522072 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:25 crc kubenswrapper[4689]: E0307 04:19:25.907981 4689 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 04:19:27 crc kubenswrapper[4689]: I0307 04:19:27.697877 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:27 crc kubenswrapper[4689]: I0307 04:19:27.698485 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:27 crc kubenswrapper[4689]: I0307 04:19:27.700350 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:27 crc kubenswrapper[4689]: I0307 04:19:27.700414 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:27 crc kubenswrapper[4689]: I0307 04:19:27.700435 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:27 crc kubenswrapper[4689]: I0307 04:19:27.705320 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:27 crc kubenswrapper[4689]: I0307 04:19:27.927994 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:27 crc kubenswrapper[4689]: I0307 04:19:27.930377 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:27 crc kubenswrapper[4689]: I0307 04:19:27.930450 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:27 crc kubenswrapper[4689]: I0307 04:19:27.930464 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:30 crc kubenswrapper[4689]: W0307 04:19:30.216451 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.216591 4689 trace.go:236] Trace[738824336]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Mar-2026 04:19:20.214) (total time: 10002ms): Mar 07 04:19:30 crc kubenswrapper[4689]: Trace[738824336]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:19:30.216) Mar 07 04:19:30 crc kubenswrapper[4689]: Trace[738824336]: [10.002023977s] [10.002023977s] END Mar 07 04:19:30 crc kubenswrapper[4689]: E0307 04:19:30.216623 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 07 04:19:30 crc 
kubenswrapper[4689]: W0307 04:19:30.266608 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.266738 4689 trace.go:236] Trace[349717813]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Mar-2026 04:19:20.264) (total time: 10002ms): Mar 07 04:19:30 crc kubenswrapper[4689]: Trace[349717813]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (04:19:30.266) Mar 07 04:19:30 crc kubenswrapper[4689]: Trace[349717813]: [10.002350167s] [10.002350167s] END Mar 07 04:19:30 crc kubenswrapper[4689]: E0307 04:19:30.266773 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.283191 4689 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded" start-of-body= Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.283258 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded" Mar 07 04:19:30 crc kubenswrapper[4689]: E0307 04:19:30.546321 4689 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 04:19:30 crc kubenswrapper[4689]: W0307 04:19:30.548336 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:30Z is after 2026-02-23T05:33:13Z Mar 07 04:19:30 crc kubenswrapper[4689]: E0307 04:19:30.548411 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 04:19:30 crc kubenswrapper[4689]: E0307 04:19:30.553082 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:30Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 07 04:19:30 crc kubenswrapper[4689]: W0307 04:19:30.555236 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:30Z is after 2026-02-23T05:33:13Z Mar 07 04:19:30 crc kubenswrapper[4689]: E0307 04:19:30.555343 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 04:19:30 crc kubenswrapper[4689]: E0307 04:19:30.557253 4689 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189a7430ad4d00fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.738468605 +0000 UTC m=+0.784852104,LastTimestamp:2026-03-07 04:19:15.738468605 +0000 UTC m=+0.784852104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.557506 4689 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.557582 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.560303 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:30Z is after 2026-02-23T05:33:13Z Mar 07 04:19:30 crc kubenswrapper[4689]: E0307 04:19:30.564837 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:30Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.698282 4689 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.698426 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.742429 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:30Z is after 2026-02-23T05:33:13Z Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.939042 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.941716 4689 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ce179675be9316692288db78175210e195bfe65d783a6c110a13d28cd550ee49" exitCode=255 Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.941802 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ce179675be9316692288db78175210e195bfe65d783a6c110a13d28cd550ee49"} Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.942078 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.943398 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.943456 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.943483 4689 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 07 04:19:30 crc kubenswrapper[4689]: I0307 04:19:30.944851 4689 scope.go:117] "RemoveContainer" containerID="ce179675be9316692288db78175210e195bfe65d783a6c110a13d28cd550ee49" Mar 07 04:19:31 crc kubenswrapper[4689]: I0307 04:19:31.742211 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:31Z is after 2026-02-23T05:33:13Z Mar 07 04:19:31 crc kubenswrapper[4689]: I0307 04:19:31.947439 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 07 04:19:31 crc kubenswrapper[4689]: I0307 04:19:31.948452 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 07 04:19:31 crc kubenswrapper[4689]: I0307 04:19:31.950881 4689 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2fe82f712df7331df8fddb61a0fea2ebb06cf722c5ca6a40e1adab082aa42912" exitCode=255 Mar 07 04:19:31 crc kubenswrapper[4689]: I0307 04:19:31.950955 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2fe82f712df7331df8fddb61a0fea2ebb06cf722c5ca6a40e1adab082aa42912"} Mar 07 04:19:31 crc kubenswrapper[4689]: I0307 04:19:31.951037 4689 scope.go:117] "RemoveContainer" containerID="ce179675be9316692288db78175210e195bfe65d783a6c110a13d28cd550ee49" Mar 07 04:19:31 crc kubenswrapper[4689]: I0307 04:19:31.951242 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 07 04:19:31 crc kubenswrapper[4689]: I0307 04:19:31.952398 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:31 crc kubenswrapper[4689]: I0307 04:19:31.952480 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:31 crc kubenswrapper[4689]: I0307 04:19:31.952507 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:31 crc kubenswrapper[4689]: I0307 04:19:31.953850 4689 scope.go:117] "RemoveContainer" containerID="2fe82f712df7331df8fddb61a0fea2ebb06cf722c5ca6a40e1adab082aa42912" Mar 07 04:19:31 crc kubenswrapper[4689]: E0307 04:19:31.954301 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:19:32 crc kubenswrapper[4689]: I0307 04:19:32.745410 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:32Z is after 2026-02-23T05:33:13Z Mar 07 04:19:32 crc kubenswrapper[4689]: I0307 04:19:32.955553 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 07 04:19:33 crc kubenswrapper[4689]: I0307 04:19:33.744075 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:33Z is after 2026-02-23T05:33:13Z Mar 07 04:19:34 crc kubenswrapper[4689]: W0307 04:19:34.402562 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:34Z is after 2026-02-23T05:33:13Z Mar 07 04:19:34 crc kubenswrapper[4689]: E0307 04:19:34.402708 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 04:19:34 crc kubenswrapper[4689]: I0307 04:19:34.744114 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:34Z is after 2026-02-23T05:33:13Z Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.291448 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.291720 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.293114 4689 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.293147 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.293160 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.293704 4689 scope.go:117] "RemoveContainer" containerID="2fe82f712df7331df8fddb61a0fea2ebb06cf722c5ca6a40e1adab082aa42912" Mar 07 04:19:35 crc kubenswrapper[4689]: E0307 04:19:35.293856 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.298600 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.461861 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:35 crc kubenswrapper[4689]: W0307 04:19:35.524784 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:35Z is after 2026-02-23T05:33:13Z Mar 07 04:19:35 crc kubenswrapper[4689]: E0307 04:19:35.524914 4689 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.553846 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.554212 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.555843 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.555907 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.555949 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.568727 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.745373 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:35Z is after 2026-02-23T05:33:13Z Mar 07 04:19:35 crc kubenswrapper[4689]: E0307 04:19:35.908160 4689 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" 
Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.965970 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.966136 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.967602 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.967641 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.967655 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.968222 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.968285 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.968303 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:35 crc kubenswrapper[4689]: I0307 04:19:35.969270 4689 scope.go:117] "RemoveContainer" containerID="2fe82f712df7331df8fddb61a0fea2ebb06cf722c5ca6a40e1adab082aa42912" Mar 07 04:19:35 crc kubenswrapper[4689]: E0307 04:19:35.969560 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:19:36 crc kubenswrapper[4689]: I0307 04:19:36.744742 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:36Z is after 2026-02-23T05:33:13Z Mar 07 04:19:36 crc kubenswrapper[4689]: E0307 04:19:36.958707 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:36Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 07 04:19:36 crc kubenswrapper[4689]: I0307 04:19:36.965918 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:36 crc kubenswrapper[4689]: I0307 04:19:36.968418 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:36 crc kubenswrapper[4689]: I0307 04:19:36.968491 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:36 crc kubenswrapper[4689]: I0307 04:19:36.968515 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:36 crc kubenswrapper[4689]: I0307 04:19:36.968550 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:36 crc kubenswrapper[4689]: I0307 04:19:36.968558 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:19:36 crc kubenswrapper[4689]: I0307 04:19:36.970528 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 07 04:19:36 crc kubenswrapper[4689]: I0307 04:19:36.970614 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:36 crc kubenswrapper[4689]: I0307 04:19:36.970685 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:36 crc kubenswrapper[4689]: I0307 04:19:36.971563 4689 scope.go:117] "RemoveContainer" containerID="2fe82f712df7331df8fddb61a0fea2ebb06cf722c5ca6a40e1adab082aa42912" Mar 07 04:19:36 crc kubenswrapper[4689]: E0307 04:19:36.971846 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:19:36 crc kubenswrapper[4689]: E0307 04:19:36.975242 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:36Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 04:19:37 crc kubenswrapper[4689]: I0307 04:19:37.742975 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:37Z is after 2026-02-23T05:33:13Z Mar 07 04:19:38 crc kubenswrapper[4689]: I0307 04:19:38.743514 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:38Z is after 2026-02-23T05:33:13Z Mar 07 04:19:38 crc kubenswrapper[4689]: I0307 04:19:38.979687 4689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 04:19:38 crc kubenswrapper[4689]: E0307 04:19:38.985443 4689 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 04:19:39 crc kubenswrapper[4689]: I0307 04:19:39.742841 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:39Z is after 2026-02-23T05:33:13Z Mar 07 04:19:40 crc kubenswrapper[4689]: E0307 04:19:40.563612 4689 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189a7430ad4d00fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.738468605 +0000 UTC m=+0.784852104,LastTimestamp:2026-03-07 04:19:15.738468605 +0000 UTC m=+0.784852104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:19:40 crc kubenswrapper[4689]: I0307 04:19:40.697826 4689 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 04:19:40 crc kubenswrapper[4689]: I0307 04:19:40.697915 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 04:19:40 crc kubenswrapper[4689]: I0307 04:19:40.743551 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:40Z is after 2026-02-23T05:33:13Z Mar 07 04:19:41 crc kubenswrapper[4689]: I0307 04:19:41.399740 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:19:41 crc kubenswrapper[4689]: I0307 04:19:41.400020 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:41 crc kubenswrapper[4689]: I0307 
04:19:41.401758 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:41 crc kubenswrapper[4689]: I0307 04:19:41.401831 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:41 crc kubenswrapper[4689]: I0307 04:19:41.401857 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:41 crc kubenswrapper[4689]: I0307 04:19:41.402895 4689 scope.go:117] "RemoveContainer" containerID="2fe82f712df7331df8fddb61a0fea2ebb06cf722c5ca6a40e1adab082aa42912" Mar 07 04:19:41 crc kubenswrapper[4689]: E0307 04:19:41.403305 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:19:41 crc kubenswrapper[4689]: W0307 04:19:41.539462 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:41Z is after 2026-02-23T05:33:13Z Mar 07 04:19:41 crc kubenswrapper[4689]: E0307 04:19:41.539643 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:41Z 
is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 04:19:41 crc kubenswrapper[4689]: I0307 04:19:41.744238 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:41Z is after 2026-02-23T05:33:13Z Mar 07 04:19:42 crc kubenswrapper[4689]: I0307 04:19:42.742921 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:42Z is after 2026-02-23T05:33:13Z Mar 07 04:19:43 crc kubenswrapper[4689]: W0307 04:19:43.319982 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:43Z is after 2026-02-23T05:33:13Z Mar 07 04:19:43 crc kubenswrapper[4689]: E0307 04:19:43.320111 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 04:19:43 crc kubenswrapper[4689]: I0307 04:19:43.746156 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:43Z is after 2026-02-23T05:33:13Z Mar 07 04:19:43 crc kubenswrapper[4689]: E0307 04:19:43.964090 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:43Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 07 04:19:43 crc kubenswrapper[4689]: I0307 04:19:43.976344 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:43 crc kubenswrapper[4689]: I0307 04:19:43.978537 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:43 crc kubenswrapper[4689]: I0307 04:19:43.978750 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:43 crc kubenswrapper[4689]: I0307 04:19:43.978932 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:43 crc kubenswrapper[4689]: I0307 04:19:43.979072 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:19:43 crc kubenswrapper[4689]: E0307 04:19:43.984102 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:43Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 04:19:44 crc kubenswrapper[4689]: I0307 04:19:44.745158 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:44Z is after 2026-02-23T05:33:13Z Mar 07 04:19:45 crc kubenswrapper[4689]: I0307 04:19:45.745357 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:45Z is after 2026-02-23T05:33:13Z Mar 07 04:19:45 crc kubenswrapper[4689]: E0307 04:19:45.908448 4689 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 04:19:46 crc kubenswrapper[4689]: W0307 04:19:46.702143 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:46Z is after 2026-02-23T05:33:13Z Mar 07 04:19:46 crc kubenswrapper[4689]: E0307 04:19:46.702240 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 04:19:46 crc kubenswrapper[4689]: I0307 04:19:46.744520 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:46Z is after 2026-02-23T05:33:13Z Mar 07 04:19:47 crc kubenswrapper[4689]: I0307 04:19:47.745266 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 2026-02-23T05:33:13Z Mar 07 04:19:48 crc kubenswrapper[4689]: W0307 04:19:48.191614 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:48Z is after 2026-02-23T05:33:13Z Mar 07 04:19:48 crc kubenswrapper[4689]: E0307 04:19:48.191760 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 04:19:48 crc kubenswrapper[4689]: I0307 04:19:48.743268 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:48Z is after 2026-02-23T05:33:13Z Mar 07 04:19:48 crc kubenswrapper[4689]: I0307 04:19:48.835095 4689 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:38854->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 07 04:19:48 crc kubenswrapper[4689]: I0307 04:19:48.835207 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:38854->192.168.126.11:10357: read: connection reset by peer" Mar 07 04:19:48 crc kubenswrapper[4689]: I0307 04:19:48.835289 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:48 crc kubenswrapper[4689]: I0307 04:19:48.835503 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:48 crc kubenswrapper[4689]: I0307 04:19:48.837250 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:48 crc kubenswrapper[4689]: I0307 04:19:48.837323 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:48 crc kubenswrapper[4689]: I0307 04:19:48.837344 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:48 crc kubenswrapper[4689]: I0307 04:19:48.838067 4689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup 
probe, will be restarted" Mar 07 04:19:48 crc kubenswrapper[4689]: I0307 04:19:48.838307 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24" gracePeriod=30 Mar 07 04:19:49 crc kubenswrapper[4689]: I0307 04:19:49.007600 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 07 04:19:49 crc kubenswrapper[4689]: I0307 04:19:49.008238 4689 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24" exitCode=255 Mar 07 04:19:49 crc kubenswrapper[4689]: I0307 04:19:49.008318 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24"} Mar 07 04:19:49 crc kubenswrapper[4689]: I0307 04:19:49.744850 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:49Z is after 2026-02-23T05:33:13Z Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.014871 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.015554 4689 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82"} Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.015780 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.017652 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.017721 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.017737 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:50 crc kubenswrapper[4689]: E0307 04:19:50.569799 4689 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189a7430ad4d00fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.738468605 +0000 UTC m=+0.784852104,LastTimestamp:2026-03-07 04:19:15.738468605 +0000 UTC m=+0.784852104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.744752 4689 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:50Z is after 2026-02-23T05:33:13Z Mar 07 04:19:50 crc kubenswrapper[4689]: E0307 04:19:50.970306 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:50Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.984773 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.986584 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.986637 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.986656 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:50 crc kubenswrapper[4689]: I0307 04:19:50.986723 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:19:50 crc kubenswrapper[4689]: E0307 04:19:50.990045 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:50Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 04:19:51 crc kubenswrapper[4689]: I0307 04:19:51.018978 4689 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 07 04:19:51 crc kubenswrapper[4689]: I0307 04:19:51.019989 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:51 crc kubenswrapper[4689]: I0307 04:19:51.020037 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:51 crc kubenswrapper[4689]: I0307 04:19:51.020054 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:51 crc kubenswrapper[4689]: I0307 04:19:51.745906 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:51Z is after 2026-02-23T05:33:13Z Mar 07 04:19:52 crc kubenswrapper[4689]: I0307 04:19:52.007622 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:52 crc kubenswrapper[4689]: I0307 04:19:52.022835 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:52 crc kubenswrapper[4689]: I0307 04:19:52.027450 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:52 crc kubenswrapper[4689]: I0307 04:19:52.027548 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:52 crc kubenswrapper[4689]: I0307 04:19:52.027582 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:52 crc kubenswrapper[4689]: I0307 04:19:52.743385 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:52Z is after 2026-02-23T05:33:13Z Mar 07 04:19:53 crc kubenswrapper[4689]: I0307 04:19:53.742666 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:53Z is after 2026-02-23T05:33:13Z Mar 07 04:19:54 crc kubenswrapper[4689]: I0307 04:19:54.744562 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:54Z is after 2026-02-23T05:33:13Z Mar 07 04:19:54 crc kubenswrapper[4689]: I0307 04:19:54.825272 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:54 crc kubenswrapper[4689]: I0307 04:19:54.826900 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:54 crc kubenswrapper[4689]: I0307 04:19:54.826953 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:54 crc kubenswrapper[4689]: I0307 04:19:54.826972 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:54 crc kubenswrapper[4689]: I0307 04:19:54.828216 4689 scope.go:117] "RemoveContainer" containerID="2fe82f712df7331df8fddb61a0fea2ebb06cf722c5ca6a40e1adab082aa42912" Mar 07 04:19:55 crc kubenswrapper[4689]: I0307 04:19:55.745214 4689 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:55Z is after 2026-02-23T05:33:13Z Mar 07 04:19:55 crc kubenswrapper[4689]: E0307 04:19:55.908708 4689 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.033059 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.033931 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.035291 4689 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a20b50dd92782d6f089871eaa4c9604e2ac0a53ef15148842e8b28e0c319c808" exitCode=255 Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.035401 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a20b50dd92782d6f089871eaa4c9604e2ac0a53ef15148842e8b28e0c319c808"} Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.035507 4689 scope.go:117] "RemoveContainer" containerID="2fe82f712df7331df8fddb61a0fea2ebb06cf722c5ca6a40e1adab082aa42912" Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.035733 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.041887 4689 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.041955 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.041986 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.043741 4689 scope.go:117] "RemoveContainer" containerID="a20b50dd92782d6f089871eaa4c9604e2ac0a53ef15148842e8b28e0c319c808" Mar 07 04:19:56 crc kubenswrapper[4689]: E0307 04:19:56.044043 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.099744 4689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 04:19:56 crc kubenswrapper[4689]: E0307 04:19:56.106656 4689 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 04:19:56 crc kubenswrapper[4689]: E0307 04:19:56.107905 4689 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: 
timed out waiting for the condition" logger="UnhandledError" Mar 07 04:19:56 crc kubenswrapper[4689]: I0307 04:19:56.745786 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:56Z is after 2026-02-23T05:33:13Z Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.043374 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.703662 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.703896 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.705546 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.705622 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.705636 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.744561 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:57Z is after 2026-02-23T05:33:13Z Mar 07 
04:19:57 crc kubenswrapper[4689]: E0307 04:19:57.976506 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:57Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.990891 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.992726 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.992816 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.992845 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:19:57 crc kubenswrapper[4689]: I0307 04:19:57.992899 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:19:57 crc kubenswrapper[4689]: E0307 04:19:57.997982 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:57Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 04:19:58 crc kubenswrapper[4689]: I0307 04:19:58.745365 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:58Z is after 2026-02-23T05:33:13Z Mar 07 
04:19:59 crc kubenswrapper[4689]: I0307 04:19:59.746714 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.579393 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430ad4d00fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.738468605 +0000 UTC m=+0.784852104,LastTimestamp:2026-03-07 04:19:15.738468605 +0000 UTC m=+0.784852104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.586360 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2115b58 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818445656 +0000 UTC m=+0.864829165,LastTimestamp:2026-03-07 04:19:15.818445656 +0000 UTC m=+0.864829165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.592513 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b211c733 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818473267 +0000 UTC m=+0.864856766,LastTimestamp:2026-03-07 04:19:15.818473267 +0000 UTC m=+0.864856766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.599669 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2120f9f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818491807 +0000 UTC m=+0.864875316,LastTimestamp:2026-03-07 04:19:15.818491807 +0000 UTC m=+0.864875316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.607646 4689 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b6fe16c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.901068997 +0000 UTC m=+0.947452506,LastTimestamp:2026-03-07 04:19:15.901068997 +0000 UTC m=+0.947452506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.614481 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2115b58\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2115b58 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818445656 +0000 UTC m=+0.864829165,LastTimestamp:2026-03-07 04:19:15.92656754 +0000 UTC m=+0.972951079,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.622969 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b211c733\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189a7430b211c733 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818473267 +0000 UTC m=+0.864856766,LastTimestamp:2026-03-07 04:19:15.92661429 +0000 UTC m=+0.972997830,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.630294 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2120f9f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2120f9f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818491807 +0000 UTC m=+0.864875316,LastTimestamp:2026-03-07 04:19:15.926642131 +0000 UTC m=+0.973025670,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.638682 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2115b58\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2115b58 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818445656 +0000 UTC m=+0.864829165,LastTimestamp:2026-03-07 04:19:15.928800104 +0000 UTC m=+0.975183613,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.648040 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b211c733\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b211c733 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818473267 +0000 UTC m=+0.864856766,LastTimestamp:2026-03-07 04:19:15.928831424 +0000 UTC m=+0.975214923,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.654993 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2120f9f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2120f9f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818491807 +0000 UTC m=+0.864875316,LastTimestamp:2026-03-07 04:19:15.928844344 +0000 UTC m=+0.975227843,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.662272 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2115b58\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2115b58 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818445656 +0000 UTC m=+0.864829165,LastTimestamp:2026-03-07 04:19:15.929028437 +0000 UTC m=+0.975411936,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.670709 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b211c733\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b211c733 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818473267 +0000 UTC 
m=+0.864856766,LastTimestamp:2026-03-07 04:19:15.929049777 +0000 UTC m=+0.975433276,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.679444 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2120f9f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2120f9f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818491807 +0000 UTC m=+0.864875316,LastTimestamp:2026-03-07 04:19:15.929065948 +0000 UTC m=+0.975449437,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.689673 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2115b58\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2115b58 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818445656 +0000 UTC m=+0.864829165,LastTimestamp:2026-03-07 04:19:15.930264906 +0000 UTC m=+0.976648435,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.697531 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b211c733\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b211c733 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818473267 +0000 UTC m=+0.864856766,LastTimestamp:2026-03-07 04:19:15.930306606 +0000 UTC m=+0.976690135,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: I0307 04:20:00.704515 4689 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 04:20:00 crc kubenswrapper[4689]: I0307 04:20:00.704586 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.705135 4689 event.go:359] "Server rejected event (will not retry!)" 
err="events \"crc.189a7430b2120f9f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2120f9f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818491807 +0000 UTC m=+0.864875316,LastTimestamp:2026-03-07 04:19:15.930359117 +0000 UTC m=+0.976742646,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.712133 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2115b58\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2115b58 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818445656 +0000 UTC m=+0.864829165,LastTimestamp:2026-03-07 04:19:15.93055645 +0000 UTC m=+0.976939949,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.718109 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b211c733\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189a7430b211c733 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818473267 +0000 UTC m=+0.864856766,LastTimestamp:2026-03-07 04:19:15.930655151 +0000 UTC m=+0.977038690,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.725418 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2120f9f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2120f9f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818491807 +0000 UTC m=+0.864875316,LastTimestamp:2026-03-07 04:19:15.930699282 +0000 UTC m=+0.977082821,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.732368 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2115b58\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2115b58 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818445656 +0000 UTC m=+0.864829165,LastTimestamp:2026-03-07 04:19:15.931966211 +0000 UTC m=+0.978349740,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.738572 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b211c733\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b211c733 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818473267 +0000 UTC m=+0.864856766,LastTimestamp:2026-03-07 04:19:15.931993331 +0000 UTC m=+0.978376860,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: I0307 04:20:00.746704 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.747294 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2120f9f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2120f9f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818491807 +0000 UTC m=+0.864875316,LastTimestamp:2026-03-07 04:19:15.932014922 +0000 UTC m=+0.978398451,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.755160 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b2115b58\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b2115b58 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818445656 +0000 UTC m=+0.864829165,LastTimestamp:2026-03-07 04:19:15.932075983 +0000 UTC m=+0.978459482,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.763979 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7430b211c733\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7430b211c733 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:15.818473267 +0000 UTC m=+0.864856766,LastTimestamp:2026-03-07 04:19:15.932090213 +0000 UTC m=+0.978473712,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.770716 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7430d345b041 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:16.375523393 +0000 UTC m=+1.421906882,LastTimestamp:2026-03-07 04:19:16.375523393 +0000 UTC m=+1.421906882,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.778157 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189a7430d3486f96 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:16.375703446 +0000 UTC m=+1.422086975,LastTimestamp:2026-03-07 04:19:16.375703446 +0000 UTC m=+1.422086975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.786513 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7430d348f3b2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:16.375737266 +0000 UTC m=+1.422120775,LastTimestamp:2026-03-07 04:19:16.375737266 +0000 UTC m=+1.422120775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc 
kubenswrapper[4689]: E0307 04:20:00.793502 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7430d3780e28 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:16.378824232 +0000 UTC m=+1.425207721,LastTimestamp:2026-03-07 04:19:16.378824232 +0000 UTC m=+1.425207721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.800138 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7430d3bc891f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:16.383312159 +0000 UTC m=+1.429695688,LastTimestamp:2026-03-07 04:19:16.383312159 +0000 UTC m=+1.429695688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.807459 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7431025c10d2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.165519058 +0000 UTC m=+2.211902587,LastTimestamp:2026-03-07 04:19:17.165519058 +0000 UTC m=+2.211902587,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.813272 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431026ebee2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.166743266 +0000 UTC m=+2.213126795,LastTimestamp:2026-03-07 04:19:17.166743266 +0000 UTC m=+2.213126795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.818079 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7431026fe093 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.166817427 +0000 UTC m=+2.213200936,LastTimestamp:2026-03-07 04:19:17.166817427 +0000 UTC m=+2.213200936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.824801 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7431027061ae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.166850478 +0000 UTC m=+2.213233967,LastTimestamp:2026-03-07 04:19:17.166850478 +0000 UTC m=+2.213233967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.832606 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7431027061c2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.166850498 +0000 UTC m=+2.213233987,LastTimestamp:2026-03-07 04:19:17.166850498 +0000 UTC m=+2.213233987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.835968 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a743103acb05f openshift-kube-scheduler 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.187579999 +0000 UTC m=+2.233963498,LastTimestamp:2026-03-07 04:19:17.187579999 +0000 UTC m=+2.233963498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.841024 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a743103d2995c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.190064476 +0000 UTC m=+2.236447965,LastTimestamp:2026-03-07 04:19:17.190064476 +0000 UTC m=+2.236447965,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.845968 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189a743103d6e2d7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.190345431 +0000 UTC m=+2.236728920,LastTimestamp:2026-03-07 04:19:17.190345431 +0000 UTC m=+2.236728920,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.852730 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a743103dc34f8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.190694136 +0000 UTC m=+2.237077625,LastTimestamp:2026-03-07 04:19:17.190694136 +0000 UTC m=+2.237077625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.860408 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189a743103e1b095 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.191053461 +0000 UTC m=+2.237436950,LastTimestamp:2026-03-07 04:19:17.191053461 +0000 UTC m=+2.237436950,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.868420 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a743103f956b2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.192603314 +0000 UTC m=+2.238986813,LastTimestamp:2026-03-07 04:19:17.192603314 +0000 UTC m=+2.238986813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.876924 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a74311bc30a15 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.591697941 +0000 UTC m=+2.638081470,LastTimestamp:2026-03-07 04:19:17.591697941 +0000 UTC m=+2.638081470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.883730 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a74311d53e7f4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.61796914 +0000 UTC m=+2.664352669,LastTimestamp:2026-03-07 04:19:17.61796914 +0000 UTC 
m=+2.664352669,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.891484 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a74311d75121e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.620142622 +0000 UTC m=+2.666526151,LastTimestamp:2026-03-07 04:19:17.620142622 +0000 UTC m=+2.666526151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.900694 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74312b3fd053 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.851533395 +0000 UTC m=+2.897916934,LastTimestamp:2026-03-07 04:19:17.851533395 +0000 UTC m=+2.897916934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.906992 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a74312b6229cf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.853784527 +0000 UTC m=+2.900168046,LastTimestamp:2026-03-07 04:19:17.853784527 +0000 UTC m=+2.900168046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.914620 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a74312b905c67 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.856812135 +0000 UTC m=+2.903195644,LastTimestamp:2026-03-07 04:19:17.856812135 +0000 UTC m=+2.903195644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.921013 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a74312bb41b29 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.859154729 +0000 UTC 
m=+2.905538218,LastTimestamp:2026-03-07 04:19:17.859154729 +0000 UTC m=+2.905538218,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.928837 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a74312d39d7cd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.884696525 +0000 UTC m=+2.931080024,LastTimestamp:2026-03-07 04:19:17.884696525 +0000 UTC m=+2.931080024,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.934812 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a74312ede1ef1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.912239857 +0000 UTC m=+2.958623356,LastTimestamp:2026-03-07 04:19:17.912239857 +0000 UTC m=+2.958623356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.941444 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a74312ef273d9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.913572313 +0000 UTC m=+2.959955812,LastTimestamp:2026-03-07 04:19:17.913572313 +0000 UTC m=+2.959955812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.950057 4689 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a74313fad7485 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.194263173 +0000 UTC m=+3.240646672,LastTimestamp:2026-03-07 04:19:18.194263173 +0000 UTC m=+3.240646672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.957729 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74313fb436e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.194706151 +0000 UTC m=+3.241089650,LastTimestamp:2026-03-07 04:19:18.194706151 +0000 UTC m=+3.241089650,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 
04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.965611 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a743141354334 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.21994066 +0000 UTC m=+3.266324149,LastTimestamp:2026-03-07 04:19:18.21994066 +0000 UTC m=+3.266324149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.971375 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a74314157a23d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.222193213 +0000 UTC m=+3.268576702,LastTimestamp:2026-03-07 04:19:18.222193213 +0000 UTC 
m=+3.268576702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.978838 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a74314159aff3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.222327795 +0000 UTC m=+3.268711284,LastTimestamp:2026-03-07 04:19:18.222327795 +0000 UTC m=+3.268711284,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.986754 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a743141b4ba18 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.228294168 +0000 UTC 
m=+3.274677657,LastTimestamp:2026-03-07 04:19:18.228294168 +0000 UTC m=+3.274677657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.991383 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7431426428d4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.239791316 +0000 UTC m=+3.286174795,LastTimestamp:2026-03-07 04:19:18.239791316 +0000 UTC m=+3.286174795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:00 crc kubenswrapper[4689]: E0307 04:20:00.998613 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7431428378b0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.241843376 +0000 UTC m=+3.288226865,LastTimestamp:2026-03-07 04:19:18.241843376 +0000 UTC m=+3.288226865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.005899 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7431473951e0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.320869856 +0000 UTC m=+3.367253345,LastTimestamp:2026-03-07 04:19:18.320869856 +0000 UTC m=+3.367253345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.013286 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a743147c1cd53 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.329814355 +0000 UTC m=+3.376197844,LastTimestamp:2026-03-07 04:19:18.329814355 +0000 UTC m=+3.376197844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.020474 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a74314865e0be openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.34056723 +0000 UTC m=+3.386950719,LastTimestamp:2026-03-07 04:19:18.34056723 +0000 UTC m=+3.386950719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.030553 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7431487a1773 
openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.341891955 +0000 UTC m=+3.388275484,LastTimestamp:2026-03-07 04:19:18.341891955 +0000 UTC m=+3.388275484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.036244 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74314f781ae1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.459202273 +0000 UTC m=+3.505585762,LastTimestamp:2026-03-07 04:19:18.459202273 +0000 UTC m=+3.505585762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.041556 4689 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a743150b2418a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.479790474 +0000 UTC m=+3.526173973,LastTimestamp:2026-03-07 04:19:18.479790474 +0000 UTC m=+3.526173973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.045796 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a743150e8369d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.483326621 +0000 UTC m=+3.529710290,LastTimestamp:2026-03-07 04:19:18.483326621 +0000 UTC m=+3.529710290,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.055819 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a743156f4efc4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.584823748 +0000 UTC m=+3.631207237,LastTimestamp:2026-03-07 04:19:18.584823748 +0000 UTC m=+3.631207237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.061278 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7431583a8669 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.606161513 +0000 UTC 
m=+3.652544992,LastTimestamp:2026-03-07 04:19:18.606161513 +0000 UTC m=+3.652544992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.065503 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7431584f3597 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.607517079 +0000 UTC m=+3.653900558,LastTimestamp:2026-03-07 04:19:18.607517079 +0000 UTC m=+3.653900558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.070219 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74315d5716c9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.691919561 +0000 UTC m=+3.738303050,LastTimestamp:2026-03-07 04:19:18.691919561 +0000 UTC m=+3.738303050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.074012 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74315f3b1344 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.723638084 +0000 UTC m=+3.770021573,LastTimestamp:2026-03-07 04:19:18.723638084 +0000 UTC m=+3.770021573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.077856 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74315f4e6fb0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.724906928 +0000 UTC m=+3.771290407,LastTimestamp:2026-03-07 04:19:18.724906928 +0000 UTC m=+3.771290407,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.081915 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a743165d786c9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.834554569 +0000 UTC m=+3.880938058,LastTimestamp:2026-03-07 04:19:18.834554569 +0000 UTC m=+3.880938058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.086521 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a743167f53fd1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.870056913 +0000 UTC m=+3.916440402,LastTimestamp:2026-03-07 04:19:18.870056913 +0000 UTC m=+3.916440402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.092231 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a743168aaf6ca openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container 
kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.88196577 +0000 UTC m=+3.928349259,LastTimestamp:2026-03-07 04:19:18.88196577 +0000 UTC m=+3.928349259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.096824 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74316c1acbcc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.939626444 +0000 UTC m=+3.986009933,LastTimestamp:2026-03-07 04:19:18.939626444 +0000 UTC m=+3.986009933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.101582 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74316d7b57dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.962730973 +0000 UTC m=+4.009114462,LastTimestamp:2026-03-07 04:19:18.962730973 +0000 UTC m=+4.009114462,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.106308 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74316d8f5e7c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.964043388 +0000 UTC m=+4.010426877,LastTimestamp:2026-03-07 04:19:18.964043388 +0000 UTC m=+4.010426877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.111501 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a743175a3be85 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:19.099596421 +0000 UTC m=+4.145979910,LastTimestamp:2026-03-07 04:19:19.099596421 +0000 UTC m=+4.145979910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.116089 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a743176d7b11b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:19.119778075 +0000 UTC m=+4.166161564,LastTimestamp:2026-03-07 04:19:19.119778075 +0000 UTC m=+4.166161564,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.120841 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74317a05cf2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:19.173132078 +0000 UTC m=+4.219515567,LastTimestamp:2026-03-07 04:19:19.173132078 +0000 UTC m=+4.219515567,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.125680 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74317af3f7fe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:19.188740094 +0000 UTC m=+4.235123583,LastTimestamp:2026-03-07 04:19:19.188740094 +0000 UTC m=+4.235123583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.129836 4689 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431a45b05eb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:19.883359723 +0000 UTC m=+4.929743212,LastTimestamp:2026-03-07 04:19:19.883359723 +0000 UTC m=+4.929743212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.134917 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431b405b712 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:20.146204434 +0000 UTC m=+5.192587933,LastTimestamp:2026-03-07 04:19:20.146204434 +0000 UTC m=+5.192587933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.139194 4689 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431b4b465b1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:20.157652401 +0000 UTC m=+5.204035930,LastTimestamp:2026-03-07 04:19:20.157652401 +0000 UTC m=+5.204035930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.140709 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431b4cc2524 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:20.15920874 +0000 UTC m=+5.205592319,LastTimestamp:2026-03-07 04:19:20.15920874 +0000 UTC m=+5.205592319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.144419 
4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431c3e7e1bd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:20.412684733 +0000 UTC m=+5.459068252,LastTimestamp:2026-03-07 04:19:20.412684733 +0000 UTC m=+5.459068252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.145422 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431c50f8a67 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:20.432061031 +0000 UTC m=+5.478444520,LastTimestamp:2026-03-07 04:19:20.432061031 +0000 UTC m=+5.478444520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.148832 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431c51f8f43 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:20.433110851 +0000 UTC m=+5.479494540,LastTimestamp:2026-03-07 04:19:20.433110851 +0000 UTC m=+5.479494540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.154314 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431d4a71992 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:20.693651858 +0000 UTC m=+5.740035397,LastTimestamp:2026-03-07 04:19:20.693651858 +0000 UTC m=+5.740035397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.158064 4689 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431d5cd8474 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:20.712946804 +0000 UTC m=+5.759330333,LastTimestamp:2026-03-07 04:19:20.712946804 +0000 UTC m=+5.759330333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.161764 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431d5e5dd81 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:20.714542465 +0000 UTC m=+5.760925994,LastTimestamp:2026-03-07 04:19:20.714542465 +0000 UTC m=+5.760925994,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: 
E0307 04:20:01.165882 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431e71794e7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:21.003013351 +0000 UTC m=+6.049396880,LastTimestamp:2026-03-07 04:19:21.003013351 +0000 UTC m=+6.049396880,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.170444 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431e833d3f4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:21.021641716 +0000 UTC m=+6.068025235,LastTimestamp:2026-03-07 04:19:21.021641716 +0000 UTC m=+6.068025235,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.182124 4689 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431e84ff117 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:21.023484183 +0000 UTC m=+6.069867702,LastTimestamp:2026-03-07 04:19:21.023484183 +0000 UTC m=+6.069867702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.189350 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431f87b6c81 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:21.294769281 +0000 UTC m=+6.341152780,LastTimestamp:2026-03-07 04:19:21.294769281 +0000 UTC m=+6.341152780,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.196250 4689 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7431f9c587c8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:21.316403144 +0000 UTC m=+6.362786683,LastTimestamp:2026-03-07 04:19:21.316403144 +0000 UTC m=+6.362786683,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.203934 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 07 04:20:01 crc kubenswrapper[4689]: &Event{ObjectMeta:{kube-apiserver-crc.189a7434103c8d6b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:6443/livez": context deadline exceeded Mar 07 04:20:01 crc kubenswrapper[4689]: body: Mar 07 04:20:01 crc kubenswrapper[4689]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:30.283236715 +0000 UTC m=+15.329620204,LastTimestamp:2026-03-07 04:19:30.283236715 +0000 UTC m=+15.329620204,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 04:20:01 crc kubenswrapper[4689]: > Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.209152 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7434103d6963 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:6443/livez\": context deadline exceeded,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:30.283293027 +0000 UTC m=+15.329676516,LastTimestamp:2026-03-07 04:19:30.283293027 +0000 UTC m=+15.329676516,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.215587 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 07 04:20:01 crc kubenswrapper[4689]: &Event{ObjectMeta:{kube-apiserver-crc.189a743420964b7c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 07 04:20:01 crc 
kubenswrapper[4689]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 07 04:20:01 crc kubenswrapper[4689]: Mar 07 04:20:01 crc kubenswrapper[4689]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:30.557553532 +0000 UTC m=+15.603937061,LastTimestamp:2026-03-07 04:19:30.557553532 +0000 UTC m=+15.603937061,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 04:20:01 crc kubenswrapper[4689]: > Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.222127 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a743420976d42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:30.557627714 +0000 UTC m=+15.604011243,LastTimestamp:2026-03-07 04:19:30.557627714 +0000 UTC m=+15.604011243,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.228813 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 07 04:20:01 crc kubenswrapper[4689]: &Event{ObjectMeta:{kube-controller-manager-crc.189a743428fb48b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 07 04:20:01 crc kubenswrapper[4689]: body: Mar 07 04:20:01 crc kubenswrapper[4689]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:30.69838968 +0000 UTC m=+15.744773169,LastTimestamp:2026-03-07 04:19:30.69838968 +0000 UTC m=+15.744773169,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 04:20:01 crc kubenswrapper[4689]: > Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.233643 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a743428fc4c8a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:30.698456202 +0000 UTC m=+15.744839691,LastTimestamp:2026-03-07 04:19:30.698456202 +0000 UTC m=+15.744839691,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.242268 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189a74316d8f5e7c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74316d8f5e7c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:18.964043388 +0000 UTC m=+4.010426877,LastTimestamp:2026-03-07 04:19:30.946612398 +0000 UTC m=+15.992995927,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.248640 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189a74317a05cf2e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74317a05cf2e openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:19.173132078 +0000 UTC m=+4.219515567,LastTimestamp:2026-03-07 04:19:31.179156506 +0000 UTC m=+16.225539995,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.256115 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189a74317af3f7fe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a74317af3f7fe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:19.188740094 +0000 UTC m=+4.235123583,LastTimestamp:2026-03-07 04:19:31.212582296 +0000 UTC m=+16.258965805,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.265865 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a743428fb48b0\" is forbidden: User \"system:anonymous\" cannot patch resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 07 04:20:01 crc kubenswrapper[4689]: &Event{ObjectMeta:{kube-controller-manager-crc.189a743428fb48b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 07 04:20:01 crc kubenswrapper[4689]: body: Mar 07 04:20:01 crc kubenswrapper[4689]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:30.69838968 +0000 UTC m=+15.744773169,LastTimestamp:2026-03-07 04:19:40.697893544 +0000 UTC m=+25.744277033,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 04:20:01 crc kubenswrapper[4689]: > Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.273325 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a743428fc4c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a743428fc4c8a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:30.698456202 +0000 UTC m=+15.744839691,LastTimestamp:2026-03-07 04:19:40.697948575 +0000 UTC m=+25.744332064,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.280734 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 07 04:20:01 crc kubenswrapper[4689]: &Event{ObjectMeta:{kube-controller-manager-crc.189a7438620462fe openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:38854->192.168.126.11:10357: read: connection reset by peer Mar 07 04:20:01 crc kubenswrapper[4689]: body: Mar 07 04:20:01 crc kubenswrapper[4689]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:48.835156734 +0000 UTC m=+33.881540233,LastTimestamp:2026-03-07 04:19:48.835156734 +0000 UTC m=+33.881540233,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 04:20:01 crc kubenswrapper[4689]: > Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.286810 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a74386205c29a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:38854->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:48.835246746 +0000 UTC m=+33.881630245,LastTimestamp:2026-03-07 04:19:48.835246746 +0000 UTC m=+33.881630245,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.294747 4689 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a743862340fe1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:48.838281185 +0000 UTC m=+33.884664694,LastTimestamp:2026-03-07 04:19:48.838281185 +0000 UTC 
m=+33.884664694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.300333 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a743103f956b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a743103f956b2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.192603314 +0000 UTC m=+2.238986813,LastTimestamp:2026-03-07 04:19:49.361022773 +0000 UTC m=+34.407406262,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.305651 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a74311bc30a15\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a74311bc30a15 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.591697941 +0000 UTC m=+2.638081470,LastTimestamp:2026-03-07 04:19:49.568751255 +0000 UTC m=+34.615134744,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.311754 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a74311d53e7f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a74311d53e7f4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:17.61796914 +0000 UTC m=+2.664352669,LastTimestamp:2026-03-07 04:19:49.578306783 +0000 UTC m=+34.624690272,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.320782 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a743428fb48b0\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 07 04:20:01 crc kubenswrapper[4689]: &Event{ObjectMeta:{kube-controller-manager-crc.189a743428fb48b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 07 04:20:01 crc kubenswrapper[4689]: body: Mar 07 04:20:01 crc kubenswrapper[4689]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:30.69838968 +0000 UTC m=+15.744773169,LastTimestamp:2026-03-07 04:20:00.704567474 +0000 UTC m=+45.750950963,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 04:20:01 crc kubenswrapper[4689]: > Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.327791 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a743428fc4c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a743428fc4c8a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:30.698456202 +0000 UTC m=+15.744839691,LastTimestamp:2026-03-07 04:20:00.704611595 +0000 UTC m=+45.750995084,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:20:01 crc kubenswrapper[4689]: I0307 04:20:01.399687 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:20:01 crc kubenswrapper[4689]: I0307 04:20:01.399959 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:01 crc kubenswrapper[4689]: I0307 04:20:01.401727 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:01 crc kubenswrapper[4689]: I0307 04:20:01.401790 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:01 crc kubenswrapper[4689]: I0307 04:20:01.401812 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:01 crc kubenswrapper[4689]: I0307 04:20:01.402682 4689 scope.go:117] "RemoveContainer" containerID="a20b50dd92782d6f089871eaa4c9604e2ac0a53ef15148842e8b28e0c319c808" Mar 07 04:20:01 crc kubenswrapper[4689]: E0307 04:20:01.402999 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 
04:20:01 crc kubenswrapper[4689]: I0307 04:20:01.746298 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:02 crc kubenswrapper[4689]: I0307 04:20:02.747853 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:02 crc kubenswrapper[4689]: W0307 04:20:02.835384 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 07 04:20:02 crc kubenswrapper[4689]: E0307 04:20:02.835476 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 07 04:20:03 crc kubenswrapper[4689]: I0307 04:20:03.746620 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:04 crc kubenswrapper[4689]: I0307 04:20:04.747924 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:04 crc kubenswrapper[4689]: E0307 04:20:04.982576 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io 
\"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 04:20:04 crc kubenswrapper[4689]: I0307 04:20:04.998550 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.000633 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.000795 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.000878 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.000977 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:20:05 crc kubenswrapper[4689]: E0307 04:20:05.009646 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.124117 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.124363 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.125962 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.126013 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:05 crc 
kubenswrapper[4689]: I0307 04:20:05.126035 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.462453 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.463332 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.465227 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.465299 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.465314 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.466043 4689 scope.go:117] "RemoveContainer" containerID="a20b50dd92782d6f089871eaa4c9604e2ac0a53ef15148842e8b28e0c319c808" Mar 07 04:20:05 crc kubenswrapper[4689]: E0307 04:20:05.466281 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:20:05 crc kubenswrapper[4689]: I0307 04:20:05.747691 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:05 crc 
kubenswrapper[4689]: E0307 04:20:05.909282 4689 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 04:20:06 crc kubenswrapper[4689]: I0307 04:20:06.747424 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:06 crc kubenswrapper[4689]: W0307 04:20:06.957313 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:06 crc kubenswrapper[4689]: E0307 04:20:06.957396 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 07 04:20:07 crc kubenswrapper[4689]: I0307 04:20:07.746850 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:07 crc kubenswrapper[4689]: W0307 04:20:07.882282 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 07 04:20:07 crc kubenswrapper[4689]: E0307 04:20:07.882370 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 07 04:20:07 crc kubenswrapper[4689]: W0307 04:20:07.964455 4689 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 07 04:20:07 crc kubenswrapper[4689]: E0307 04:20:07.964543 4689 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 07 04:20:08 crc kubenswrapper[4689]: I0307 04:20:08.746971 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:09 crc kubenswrapper[4689]: I0307 04:20:09.745242 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:10 crc kubenswrapper[4689]: I0307 04:20:10.697856 4689 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 04:20:10 crc kubenswrapper[4689]: I0307 04:20:10.697971 4689 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 04:20:10 crc kubenswrapper[4689]: E0307 04:20:10.704344 4689 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a743428fb48b0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 07 04:20:10 crc kubenswrapper[4689]: &Event{ObjectMeta:{kube-controller-manager-crc.189a743428fb48b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 07 04:20:10 crc kubenswrapper[4689]: body: Mar 07 04:20:10 crc kubenswrapper[4689]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:19:30.69838968 +0000 UTC m=+15.744773169,LastTimestamp:2026-03-07 04:20:10.69794337 +0000 UTC m=+55.744326889,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 04:20:10 crc kubenswrapper[4689]: > Mar 07 04:20:10 crc kubenswrapper[4689]: I0307 04:20:10.746393 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:11 crc kubenswrapper[4689]: I0307 04:20:11.745673 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:11 crc kubenswrapper[4689]: E0307 04:20:11.990849 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 04:20:12 crc kubenswrapper[4689]: I0307 04:20:12.010452 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:12 crc kubenswrapper[4689]: I0307 04:20:12.012432 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:12 crc kubenswrapper[4689]: I0307 04:20:12.012477 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:12 crc kubenswrapper[4689]: I0307 04:20:12.012504 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:12 crc kubenswrapper[4689]: I0307 04:20:12.012548 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:20:12 crc kubenswrapper[4689]: E0307 04:20:12.019366 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 07 04:20:12 crc kubenswrapper[4689]: I0307 04:20:12.747121 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:13 crc kubenswrapper[4689]: I0307 04:20:13.746568 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:14 crc kubenswrapper[4689]: I0307 04:20:14.746680 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:15 crc kubenswrapper[4689]: I0307 04:20:15.749004 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:15 crc kubenswrapper[4689]: E0307 04:20:15.909866 4689 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 04:20:16 crc kubenswrapper[4689]: I0307 04:20:16.745776 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:17 crc kubenswrapper[4689]: I0307 04:20:17.703961 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:20:17 crc kubenswrapper[4689]: I0307 04:20:17.704222 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:17 crc kubenswrapper[4689]: I0307 04:20:17.705852 4689 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:17 crc kubenswrapper[4689]: I0307 04:20:17.705890 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:17 crc kubenswrapper[4689]: I0307 04:20:17.705902 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:17 crc kubenswrapper[4689]: I0307 04:20:17.707770 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:20:17 crc kubenswrapper[4689]: I0307 04:20:17.746986 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:18 crc kubenswrapper[4689]: I0307 04:20:18.112385 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:18 crc kubenswrapper[4689]: I0307 04:20:18.113861 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:18 crc kubenswrapper[4689]: I0307 04:20:18.113913 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:18 crc kubenswrapper[4689]: I0307 04:20:18.113928 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:18 crc kubenswrapper[4689]: I0307 04:20:18.745011 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:18 crc kubenswrapper[4689]: E0307 04:20:18.996516 4689 controller.go:145] "Failed to ensure lease 
exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 04:20:19 crc kubenswrapper[4689]: I0307 04:20:19.019766 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:19 crc kubenswrapper[4689]: I0307 04:20:19.021037 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:19 crc kubenswrapper[4689]: I0307 04:20:19.021069 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:19 crc kubenswrapper[4689]: I0307 04:20:19.021080 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:19 crc kubenswrapper[4689]: I0307 04:20:19.021102 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:20:19 crc kubenswrapper[4689]: E0307 04:20:19.025806 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 07 04:20:19 crc kubenswrapper[4689]: I0307 04:20:19.744876 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:20 crc kubenswrapper[4689]: I0307 04:20:20.754326 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:20 crc kubenswrapper[4689]: I0307 04:20:20.825465 4689 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:20 crc kubenswrapper[4689]: I0307 04:20:20.826725 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:20 crc kubenswrapper[4689]: I0307 04:20:20.826761 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:20 crc kubenswrapper[4689]: I0307 04:20:20.826819 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:20 crc kubenswrapper[4689]: I0307 04:20:20.827452 4689 scope.go:117] "RemoveContainer" containerID="a20b50dd92782d6f089871eaa4c9604e2ac0a53ef15148842e8b28e0c319c808" Mar 07 04:20:21 crc kubenswrapper[4689]: I0307 04:20:21.123776 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 07 04:20:21 crc kubenswrapper[4689]: I0307 04:20:21.126002 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301"} Mar 07 04:20:21 crc kubenswrapper[4689]: I0307 04:20:21.126261 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:21 crc kubenswrapper[4689]: I0307 04:20:21.127370 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:21 crc kubenswrapper[4689]: I0307 04:20:21.127409 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:21 crc kubenswrapper[4689]: I0307 04:20:21.127418 4689 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 07 04:20:21 crc kubenswrapper[4689]: I0307 04:20:21.400328 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:20:21 crc kubenswrapper[4689]: I0307 04:20:21.744049 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.130580 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.132245 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.134128 4689 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301" exitCode=255 Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.134198 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301"} Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.134253 4689 scope.go:117] "RemoveContainer" containerID="a20b50dd92782d6f089871eaa4c9604e2ac0a53ef15148842e8b28e0c319c808" Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.134683 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.135685 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.135714 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.135725 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.143802 4689 scope.go:117] "RemoveContainer" containerID="504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301" Mar 07 04:20:22 crc kubenswrapper[4689]: E0307 04:20:22.144084 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.742886 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.825729 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.827480 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.827527 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:22 crc kubenswrapper[4689]: I0307 04:20:22.827545 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:23 crc kubenswrapper[4689]: I0307 04:20:23.139597 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 07 04:20:23 crc kubenswrapper[4689]: I0307 04:20:23.142059 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:23 crc kubenswrapper[4689]: I0307 04:20:23.142875 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:23 crc kubenswrapper[4689]: I0307 04:20:23.142910 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:23 crc kubenswrapper[4689]: I0307 04:20:23.142921 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:23 crc kubenswrapper[4689]: I0307 04:20:23.143508 4689 scope.go:117] "RemoveContainer" containerID="504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301" Mar 07 04:20:23 crc kubenswrapper[4689]: E0307 04:20:23.143670 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:20:23 crc kubenswrapper[4689]: I0307 04:20:23.742760 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:24 crc 
kubenswrapper[4689]: I0307 04:20:24.743589 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:25 crc kubenswrapper[4689]: I0307 04:20:25.461372 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:20:25 crc kubenswrapper[4689]: I0307 04:20:25.461570 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:25 crc kubenswrapper[4689]: I0307 04:20:25.462787 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:25 crc kubenswrapper[4689]: I0307 04:20:25.462827 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:25 crc kubenswrapper[4689]: I0307 04:20:25.462842 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:25 crc kubenswrapper[4689]: I0307 04:20:25.463602 4689 scope.go:117] "RemoveContainer" containerID="504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301" Mar 07 04:20:25 crc kubenswrapper[4689]: E0307 04:20:25.463812 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:20:25 crc kubenswrapper[4689]: I0307 04:20:25.745157 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:25 crc kubenswrapper[4689]: E0307 04:20:25.911792 4689 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 04:20:26 crc kubenswrapper[4689]: E0307 04:20:26.005070 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 04:20:26 crc kubenswrapper[4689]: I0307 04:20:26.026037 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:26 crc kubenswrapper[4689]: I0307 04:20:26.027589 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:26 crc kubenswrapper[4689]: I0307 04:20:26.027643 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:26 crc kubenswrapper[4689]: I0307 04:20:26.027654 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:26 crc kubenswrapper[4689]: I0307 04:20:26.027696 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:20:26 crc kubenswrapper[4689]: E0307 04:20:26.033429 4689 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 07 04:20:26 crc kubenswrapper[4689]: I0307 04:20:26.744649 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 07 04:20:27 crc kubenswrapper[4689]: I0307 04:20:27.746486 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:28 crc kubenswrapper[4689]: I0307 04:20:28.109854 4689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 04:20:28 crc kubenswrapper[4689]: I0307 04:20:28.126883 4689 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 07 04:20:28 crc kubenswrapper[4689]: I0307 04:20:28.743877 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:29 crc kubenswrapper[4689]: I0307 04:20:29.746927 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:30 crc kubenswrapper[4689]: I0307 04:20:30.746123 4689 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 04:20:31 crc kubenswrapper[4689]: I0307 04:20:31.123358 4689 csr.go:261] certificate signing request csr-7mspc is approved, waiting to be issued Mar 07 04:20:31 crc kubenswrapper[4689]: I0307 04:20:31.134401 4689 csr.go:257] certificate signing request csr-7mspc is issued Mar 07 04:20:31 crc kubenswrapper[4689]: I0307 04:20:31.231316 4689 reconstruct.go:205] "DevicePaths of 
reconstructed volumes updated" Mar 07 04:20:31 crc kubenswrapper[4689]: I0307 04:20:31.581761 4689 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 07 04:20:32 crc kubenswrapper[4689]: I0307 04:20:32.136562 4689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-20 04:08:24.119093132 +0000 UTC Mar 07 04:20:32 crc kubenswrapper[4689]: I0307 04:20:32.136640 4689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6911h47m51.982456924s for next certificate rotation Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.034525 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.036069 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.036123 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.036143 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.036384 4689 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.048254 4689 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.048685 4689 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.048722 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 07 04:20:33 crc 
kubenswrapper[4689]: I0307 04:20:33.053497 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.053561 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.053576 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.053600 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.053616 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:33Z","lastTransitionTime":"2026-03-07T04:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.072824 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.081452 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.081523 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.081543 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.081568 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.081586 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:33Z","lastTransitionTime":"2026-03-07T04:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.096372 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.106703 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.106766 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.106786 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.106810 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.106828 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:33Z","lastTransitionTime":"2026-03-07T04:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.123335 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.134157 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.134263 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.134299 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.134331 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:33 crc kubenswrapper[4689]: I0307 04:20:33.134353 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:33Z","lastTransitionTime":"2026-03-07T04:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.150808 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.151196 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.151260 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.252258 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.352573 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.453257 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.554314 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.655278 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.756294 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.856846 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:33 crc kubenswrapper[4689]: E0307 04:20:33.957976 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:34 crc kubenswrapper[4689]: E0307 04:20:34.058727 4689 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:34 crc kubenswrapper[4689]: E0307 04:20:34.158929 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:34 crc kubenswrapper[4689]: E0307 04:20:34.259761 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:34 crc kubenswrapper[4689]: E0307 04:20:34.360275 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:34 crc kubenswrapper[4689]: I0307 04:20:34.375903 4689 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 07 04:20:34 crc kubenswrapper[4689]: E0307 04:20:34.460768 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:34 crc kubenswrapper[4689]: E0307 04:20:34.561739 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:34 crc kubenswrapper[4689]: E0307 04:20:34.662278 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:34 crc kubenswrapper[4689]: E0307 04:20:34.763188 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:34 crc kubenswrapper[4689]: E0307 04:20:34.863580 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:34 crc kubenswrapper[4689]: E0307 04:20:34.964273 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:35 crc kubenswrapper[4689]: E0307 04:20:35.065281 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:35 crc 
kubenswrapper[4689]: E0307 04:20:35.166003 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:35 crc kubenswrapper[4689]: E0307 04:20:35.266413 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:35 crc kubenswrapper[4689]: E0307 04:20:35.366929 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:35 crc kubenswrapper[4689]: E0307 04:20:35.467802 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:35 crc kubenswrapper[4689]: E0307 04:20:35.568981 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:35 crc kubenswrapper[4689]: E0307 04:20:35.669885 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:35 crc kubenswrapper[4689]: E0307 04:20:35.770050 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:35 crc kubenswrapper[4689]: E0307 04:20:35.870272 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:35 crc kubenswrapper[4689]: E0307 04:20:35.913092 4689 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 04:20:35 crc kubenswrapper[4689]: E0307 04:20:35.971068 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:36 crc kubenswrapper[4689]: E0307 04:20:36.072089 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:36 crc kubenswrapper[4689]: E0307 04:20:36.173327 4689 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 07 04:20:36 crc kubenswrapper[4689]: E0307 04:20:36.274026 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:36 crc kubenswrapper[4689]: E0307 04:20:36.374340 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:36 crc kubenswrapper[4689]: E0307 04:20:36.474992 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:36 crc kubenswrapper[4689]: E0307 04:20:36.576000 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:36 crc kubenswrapper[4689]: E0307 04:20:36.676166 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:36 crc kubenswrapper[4689]: E0307 04:20:36.776942 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:36 crc kubenswrapper[4689]: E0307 04:20:36.877796 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:36 crc kubenswrapper[4689]: E0307 04:20:36.978372 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:37 crc kubenswrapper[4689]: E0307 04:20:37.079472 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:37 crc kubenswrapper[4689]: E0307 04:20:37.180069 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:37 crc kubenswrapper[4689]: E0307 04:20:37.280549 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:37 crc kubenswrapper[4689]: E0307 04:20:37.381651 4689 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:37 crc kubenswrapper[4689]: E0307 04:20:37.482276 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:37 crc kubenswrapper[4689]: E0307 04:20:37.583536 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:37 crc kubenswrapper[4689]: E0307 04:20:37.684365 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:37 crc kubenswrapper[4689]: E0307 04:20:37.785486 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:37 crc kubenswrapper[4689]: I0307 04:20:37.826091 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:37 crc kubenswrapper[4689]: I0307 04:20:37.827975 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:37 crc kubenswrapper[4689]: I0307 04:20:37.828047 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:37 crc kubenswrapper[4689]: I0307 04:20:37.828074 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:37 crc kubenswrapper[4689]: I0307 04:20:37.829151 4689 scope.go:117] "RemoveContainer" containerID="504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301" Mar 07 04:20:37 crc kubenswrapper[4689]: E0307 04:20:37.829488 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:20:37 crc kubenswrapper[4689]: E0307 04:20:37.886352 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:37 crc kubenswrapper[4689]: E0307 04:20:37.987383 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:38 crc kubenswrapper[4689]: I0307 04:20:38.029517 4689 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 07 04:20:38 crc kubenswrapper[4689]: E0307 04:20:38.088270 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:38 crc kubenswrapper[4689]: E0307 04:20:38.189259 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:38 crc kubenswrapper[4689]: E0307 04:20:38.289679 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:38 crc kubenswrapper[4689]: E0307 04:20:38.390030 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:38 crc kubenswrapper[4689]: E0307 04:20:38.490807 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:38 crc kubenswrapper[4689]: E0307 04:20:38.591234 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:38 crc kubenswrapper[4689]: E0307 04:20:38.691840 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:38 crc kubenswrapper[4689]: E0307 04:20:38.792298 4689 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 07 04:20:38 crc kubenswrapper[4689]: E0307 04:20:38.892974 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:38 crc kubenswrapper[4689]: E0307 04:20:38.993640 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:39 crc kubenswrapper[4689]: E0307 04:20:39.094840 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:39 crc kubenswrapper[4689]: E0307 04:20:39.195304 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:39 crc kubenswrapper[4689]: E0307 04:20:39.295650 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:39 crc kubenswrapper[4689]: E0307 04:20:39.396803 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:39 crc kubenswrapper[4689]: E0307 04:20:39.497261 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:39 crc kubenswrapper[4689]: E0307 04:20:39.597487 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:39 crc kubenswrapper[4689]: E0307 04:20:39.698410 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:39 crc kubenswrapper[4689]: E0307 04:20:39.798603 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:39 crc kubenswrapper[4689]: E0307 04:20:39.898735 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:39 crc kubenswrapper[4689]: E0307 04:20:39.999313 4689 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:40 crc kubenswrapper[4689]: E0307 04:20:40.099845 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:40 crc kubenswrapper[4689]: E0307 04:20:40.200731 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:40 crc kubenswrapper[4689]: E0307 04:20:40.301392 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:40 crc kubenswrapper[4689]: E0307 04:20:40.402607 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:40 crc kubenswrapper[4689]: E0307 04:20:40.503112 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:40 crc kubenswrapper[4689]: E0307 04:20:40.604267 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:40 crc kubenswrapper[4689]: E0307 04:20:40.705105 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:40 crc kubenswrapper[4689]: E0307 04:20:40.806018 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:40 crc kubenswrapper[4689]: E0307 04:20:40.906670 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:41 crc kubenswrapper[4689]: E0307 04:20:41.007798 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:41 crc kubenswrapper[4689]: E0307 04:20:41.108301 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:41 crc 
kubenswrapper[4689]: E0307 04:20:41.208936 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:41 crc kubenswrapper[4689]: E0307 04:20:41.309609 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:41 crc kubenswrapper[4689]: E0307 04:20:41.410844 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:41 crc kubenswrapper[4689]: E0307 04:20:41.511417 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:41 crc kubenswrapper[4689]: E0307 04:20:41.612565 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:41 crc kubenswrapper[4689]: E0307 04:20:41.713532 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:41 crc kubenswrapper[4689]: E0307 04:20:41.814285 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:41 crc kubenswrapper[4689]: E0307 04:20:41.915443 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:42 crc kubenswrapper[4689]: E0307 04:20:42.016297 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:42 crc kubenswrapper[4689]: E0307 04:20:42.116759 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:42 crc kubenswrapper[4689]: E0307 04:20:42.216888 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:42 crc kubenswrapper[4689]: E0307 04:20:42.318487 4689 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 07 04:20:42 crc kubenswrapper[4689]: E0307 04:20:42.418926 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:42 crc kubenswrapper[4689]: E0307 04:20:42.519944 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:42 crc kubenswrapper[4689]: E0307 04:20:42.620573 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:42 crc kubenswrapper[4689]: E0307 04:20:42.721547 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:42 crc kubenswrapper[4689]: E0307 04:20:42.822104 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:42 crc kubenswrapper[4689]: E0307 04:20:42.922614 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.023122 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.123900 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.224385 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.245525 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.251929 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.252005 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.252029 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.252080 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.252107 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:43Z","lastTransitionTime":"2026-03-07T04:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.269546 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.274762 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.274833 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.274856 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.274886 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.274910 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:43Z","lastTransitionTime":"2026-03-07T04:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.293614 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.299585 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.299631 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.299649 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.299674 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.299693 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:43Z","lastTransitionTime":"2026-03-07T04:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.314842 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.319577 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.319635 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.319653 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.319679 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:43 crc kubenswrapper[4689]: I0307 04:20:43.319697 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:43Z","lastTransitionTime":"2026-03-07T04:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.336535 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.336825 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.336884 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.437683 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.538645 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.639499 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.740788 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.841351 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:43 crc kubenswrapper[4689]: E0307 04:20:43.941685 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:44 crc kubenswrapper[4689]: E0307 04:20:44.042774 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:44 crc kubenswrapper[4689]: E0307 04:20:44.142923 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:44 crc kubenswrapper[4689]: E0307 04:20:44.244080 4689 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:44 crc kubenswrapper[4689]: E0307 04:20:44.345018 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:44 crc kubenswrapper[4689]: E0307 04:20:44.445989 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:44 crc kubenswrapper[4689]: E0307 04:20:44.546556 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:44 crc kubenswrapper[4689]: E0307 04:20:44.647790 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:44 crc kubenswrapper[4689]: E0307 04:20:44.749049 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:44 crc kubenswrapper[4689]: E0307 04:20:44.849457 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:44 crc kubenswrapper[4689]: E0307 04:20:44.950149 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:45 crc kubenswrapper[4689]: E0307 04:20:45.051116 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:45 crc kubenswrapper[4689]: E0307 04:20:45.152090 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:45 crc kubenswrapper[4689]: E0307 04:20:45.252372 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:45 crc kubenswrapper[4689]: E0307 04:20:45.352811 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:45 crc 
kubenswrapper[4689]: E0307 04:20:45.453021 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:45 crc kubenswrapper[4689]: E0307 04:20:45.553980 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:45 crc kubenswrapper[4689]: E0307 04:20:45.654992 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:45 crc kubenswrapper[4689]: E0307 04:20:45.755795 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:45 crc kubenswrapper[4689]: E0307 04:20:45.856407 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:45 crc kubenswrapper[4689]: E0307 04:20:45.913726 4689 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 04:20:45 crc kubenswrapper[4689]: E0307 04:20:45.957433 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:46 crc kubenswrapper[4689]: E0307 04:20:46.057912 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:46 crc kubenswrapper[4689]: E0307 04:20:46.158680 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:46 crc kubenswrapper[4689]: E0307 04:20:46.259492 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:46 crc kubenswrapper[4689]: E0307 04:20:46.359713 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:46 crc kubenswrapper[4689]: E0307 04:20:46.460628 4689 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 07 04:20:46 crc kubenswrapper[4689]: E0307 04:20:46.561761 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:46 crc kubenswrapper[4689]: E0307 04:20:46.662254 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:46 crc kubenswrapper[4689]: E0307 04:20:46.762932 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:46 crc kubenswrapper[4689]: E0307 04:20:46.864049 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:46 crc kubenswrapper[4689]: E0307 04:20:46.964811 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:47 crc kubenswrapper[4689]: E0307 04:20:47.065990 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:47 crc kubenswrapper[4689]: E0307 04:20:47.166377 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:47 crc kubenswrapper[4689]: E0307 04:20:47.267533 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:47 crc kubenswrapper[4689]: E0307 04:20:47.368347 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:47 crc kubenswrapper[4689]: E0307 04:20:47.468607 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:47 crc kubenswrapper[4689]: E0307 04:20:47.569251 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:47 crc kubenswrapper[4689]: E0307 04:20:47.670086 4689 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:47 crc kubenswrapper[4689]: E0307 04:20:47.771012 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:47 crc kubenswrapper[4689]: E0307 04:20:47.871318 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:47 crc kubenswrapper[4689]: E0307 04:20:47.972327 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:48 crc kubenswrapper[4689]: E0307 04:20:48.073144 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:48 crc kubenswrapper[4689]: E0307 04:20:48.174056 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:48 crc kubenswrapper[4689]: E0307 04:20:48.274772 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:48 crc kubenswrapper[4689]: E0307 04:20:48.375853 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:48 crc kubenswrapper[4689]: E0307 04:20:48.476959 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:48 crc kubenswrapper[4689]: E0307 04:20:48.577390 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:48 crc kubenswrapper[4689]: E0307 04:20:48.678016 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:48 crc kubenswrapper[4689]: E0307 04:20:48.778647 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:48 crc 
kubenswrapper[4689]: E0307 04:20:48.879741 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:48 crc kubenswrapper[4689]: E0307 04:20:48.980271 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:49 crc kubenswrapper[4689]: E0307 04:20:49.080763 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:49 crc kubenswrapper[4689]: E0307 04:20:49.181384 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:49 crc kubenswrapper[4689]: E0307 04:20:49.282527 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:49 crc kubenswrapper[4689]: E0307 04:20:49.383742 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:49 crc kubenswrapper[4689]: E0307 04:20:49.483924 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:49 crc kubenswrapper[4689]: E0307 04:20:49.584124 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:49 crc kubenswrapper[4689]: E0307 04:20:49.684831 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:49 crc kubenswrapper[4689]: E0307 04:20:49.785624 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:49 crc kubenswrapper[4689]: E0307 04:20:49.886636 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:49 crc kubenswrapper[4689]: E0307 04:20:49.987675 4689 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 07 04:20:50 crc kubenswrapper[4689]: E0307 04:20:50.088422 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:50 crc kubenswrapper[4689]: E0307 04:20:50.188835 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:50 crc kubenswrapper[4689]: E0307 04:20:50.289642 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:50 crc kubenswrapper[4689]: E0307 04:20:50.390275 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:50 crc kubenswrapper[4689]: E0307 04:20:50.491355 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:50 crc kubenswrapper[4689]: E0307 04:20:50.595489 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:50 crc kubenswrapper[4689]: E0307 04:20:50.695734 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:50 crc kubenswrapper[4689]: E0307 04:20:50.796825 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:50 crc kubenswrapper[4689]: E0307 04:20:50.896976 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:50 crc kubenswrapper[4689]: E0307 04:20:50.997888 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:51 crc kubenswrapper[4689]: E0307 04:20:51.099061 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:51 crc kubenswrapper[4689]: E0307 04:20:51.199580 4689 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:51 crc kubenswrapper[4689]: E0307 04:20:51.300127 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:51 crc kubenswrapper[4689]: E0307 04:20:51.400374 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:51 crc kubenswrapper[4689]: E0307 04:20:51.500500 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:51 crc kubenswrapper[4689]: E0307 04:20:51.600722 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:51 crc kubenswrapper[4689]: E0307 04:20:51.701621 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:51 crc kubenswrapper[4689]: E0307 04:20:51.802540 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:51 crc kubenswrapper[4689]: E0307 04:20:51.903497 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:52 crc kubenswrapper[4689]: E0307 04:20:52.004021 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:52 crc kubenswrapper[4689]: E0307 04:20:52.104505 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:52 crc kubenswrapper[4689]: E0307 04:20:52.204892 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:52 crc kubenswrapper[4689]: E0307 04:20:52.305376 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:52 crc kubenswrapper[4689]: E0307 
04:20:52.406393 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:52 crc kubenswrapper[4689]: E0307 04:20:52.507579 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:52 crc kubenswrapper[4689]: E0307 04:20:52.607975 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:52 crc kubenswrapper[4689]: E0307 04:20:52.708385 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:52 crc kubenswrapper[4689]: E0307 04:20:52.808930 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:52 crc kubenswrapper[4689]: I0307 04:20:52.825446 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:52 crc kubenswrapper[4689]: I0307 04:20:52.827127 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:52 crc kubenswrapper[4689]: I0307 04:20:52.827225 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:52 crc kubenswrapper[4689]: I0307 04:20:52.827240 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:52 crc kubenswrapper[4689]: I0307 04:20:52.828229 4689 scope.go:117] "RemoveContainer" containerID="504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301" Mar 07 04:20:52 crc kubenswrapper[4689]: E0307 04:20:52.828480 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 04:20:52 crc kubenswrapper[4689]: E0307 04:20:52.909586 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.009737 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.110468 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.211031 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.312119 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.413027 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.514387 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.615452 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.642358 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.648534 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.648614 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.648633 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.648659 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.648682 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:53Z","lastTransitionTime":"2026-03-07T04:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.665265 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.670722 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.670781 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.670799 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.670825 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.670843 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:53Z","lastTransitionTime":"2026-03-07T04:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.686464 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.691737 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.691815 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.691843 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.691875 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.691905 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:53Z","lastTransitionTime":"2026-03-07T04:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.705364 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.711017 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.711066 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.711078 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.711099 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:53 crc kubenswrapper[4689]: I0307 04:20:53.711113 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:53Z","lastTransitionTime":"2026-03-07T04:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.726525 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.726648 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.726684 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.827717 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:53 crc kubenswrapper[4689]: E0307 04:20:53.928303 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:54 crc kubenswrapper[4689]: E0307 04:20:54.029228 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:54 crc kubenswrapper[4689]: E0307 04:20:54.129867 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:54 crc kubenswrapper[4689]: E0307 04:20:54.230328 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:54 crc kubenswrapper[4689]: E0307 04:20:54.330937 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:54 crc kubenswrapper[4689]: E0307 04:20:54.431107 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:54 crc kubenswrapper[4689]: E0307 04:20:54.531834 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:54 crc kubenswrapper[4689]: E0307 04:20:54.632305 4689 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:54 crc kubenswrapper[4689]: E0307 04:20:54.733291 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:54 crc kubenswrapper[4689]: E0307 04:20:54.834419 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:54 crc kubenswrapper[4689]: E0307 04:20:54.934789 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:55 crc kubenswrapper[4689]: E0307 04:20:55.035253 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:55 crc kubenswrapper[4689]: E0307 04:20:55.135382 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:55 crc kubenswrapper[4689]: E0307 04:20:55.236268 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:55 crc kubenswrapper[4689]: E0307 04:20:55.336823 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:55 crc kubenswrapper[4689]: E0307 04:20:55.437756 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:55 crc kubenswrapper[4689]: E0307 04:20:55.538545 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:55 crc kubenswrapper[4689]: E0307 04:20:55.638861 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:55 crc kubenswrapper[4689]: E0307 04:20:55.739503 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:55 crc 
kubenswrapper[4689]: I0307 04:20:55.783695 4689 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 07 04:20:55 crc kubenswrapper[4689]: E0307 04:20:55.840147 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:55 crc kubenswrapper[4689]: E0307 04:20:55.914285 4689 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 04:20:55 crc kubenswrapper[4689]: E0307 04:20:55.941341 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:56 crc kubenswrapper[4689]: E0307 04:20:56.042357 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:56 crc kubenswrapper[4689]: E0307 04:20:56.142928 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:56 crc kubenswrapper[4689]: E0307 04:20:56.243785 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:56 crc kubenswrapper[4689]: E0307 04:20:56.344662 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:56 crc kubenswrapper[4689]: E0307 04:20:56.445026 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:56 crc kubenswrapper[4689]: E0307 04:20:56.545474 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:56 crc kubenswrapper[4689]: E0307 04:20:56.646215 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:56 crc kubenswrapper[4689]: E0307 04:20:56.747260 4689 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 07 04:20:56 crc kubenswrapper[4689]: I0307 04:20:56.825017 4689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 04:20:56 crc kubenswrapper[4689]: I0307 04:20:56.826958 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:56 crc kubenswrapper[4689]: I0307 04:20:56.827040 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:56 crc kubenswrapper[4689]: I0307 04:20:56.827063 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:56 crc kubenswrapper[4689]: E0307 04:20:56.847915 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:56 crc kubenswrapper[4689]: E0307 04:20:56.948801 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:57 crc kubenswrapper[4689]: E0307 04:20:57.049429 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:57 crc kubenswrapper[4689]: E0307 04:20:57.149868 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:57 crc kubenswrapper[4689]: E0307 04:20:57.250557 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:57 crc kubenswrapper[4689]: E0307 04:20:57.350762 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:57 crc kubenswrapper[4689]: E0307 04:20:57.451983 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:57 crc kubenswrapper[4689]: E0307 04:20:57.553209 4689 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:57 crc kubenswrapper[4689]: E0307 04:20:57.654130 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:57 crc kubenswrapper[4689]: E0307 04:20:57.754401 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:57 crc kubenswrapper[4689]: E0307 04:20:57.855222 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:57 crc kubenswrapper[4689]: E0307 04:20:57.955531 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.056006 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.156377 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.256993 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.358551 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.459052 4689 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.490567 4689 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.562493 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:58 crc 
kubenswrapper[4689]: I0307 04:20:58.562548 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.562570 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.562595 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.562642 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:58Z","lastTransitionTime":"2026-03-07T04:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.677032 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.677092 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.677114 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.677143 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.677166 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:58Z","lastTransitionTime":"2026-03-07T04:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.779816 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.779866 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.779881 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.779899 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.779913 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:58Z","lastTransitionTime":"2026-03-07T04:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.802098 4689 apiserver.go:52] "Watching apiserver" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.812362 4689 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.813125 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9bxdn","openshift-machine-config-operator/machine-config-daemon-dss5c","openshift-multus/multus-wmhqx","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-j9bx5","openshift-image-registry/node-ca-9vncl","openshift-multus/multus-additional-cni-plugins-gjvmk","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.813831 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.813965 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.814007 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.814075 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.814242 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.814430 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.814453 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.816020 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.816208 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.816381 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.816651 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.817318 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9bxdn" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.817480 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9vncl" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.817938 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.818202 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.818923 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.819129 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.820048 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.820711 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.820826 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.820934 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.824939 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.825133 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.825317 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.825888 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.826112 4689 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.826686 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.826850 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.827000 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.827163 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.827776 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.828016 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.829003 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.829196 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.829375 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.829500 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 07 04:20:58 crc kubenswrapper[4689]: 
I0307 04:20:58.829569 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.829711 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.829740 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.829791 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.829791 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.829907 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.830022 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.830080 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.830238 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.830278 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.830516 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 
04:20:58.831108 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.832228 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.838443 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.847107 4689 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.848636 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.864777 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.882667 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.882719 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.882731 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.882752 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.882764 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:58Z","lastTransitionTime":"2026-03-07T04:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.883720 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.901987 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.914080 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915294 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915339 4689 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915362 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915381 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915405 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915420 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915440 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915457 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915519 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915539 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915556 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915572 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915590 4689 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915607 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915625 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915643 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915659 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915682 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") 
pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915699 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915721 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915745 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915764 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915782 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915799 4689 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915818 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915841 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915859 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915882 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915904 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915923 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915940 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915956 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915973 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.915995 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 04:20:58 crc 
kubenswrapper[4689]: I0307 04:20:58.916018 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916033 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916050 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916068 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916083 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916101 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916119 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916136 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916155 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916190 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916211 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 
04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916229 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916249 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916267 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916285 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916305 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916325 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916346 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916365 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916387 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916409 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916430 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916451 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916476 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916498 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916517 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916537 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916555 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 
07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916578 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916598 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916616 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916635 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916651 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916667 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916704 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916721 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916737 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916752 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916768 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" 
(UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916757 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916787 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916899 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.916932 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.917134 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.917883 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.917969 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918292 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918307 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918353 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918391 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918413 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918439 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918463 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918486 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918505 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918530 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918555 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918575 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 
07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918596 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918621 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918641 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918672 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918698 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918716 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918743 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918765 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918786 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918824 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918842 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918860 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918882 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918902 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918989 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919013 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919036 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919059 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919087 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919111 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919129 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919149 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 
04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919189 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919209 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919229 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919250 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919269 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919289 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919307 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919329 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919352 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919376 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919398 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 
04:20:58.919421 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919443 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919467 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919491 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919513 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919536 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919566 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919590 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919615 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919638 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919667 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 
07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919691 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919719 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919748 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919844 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919877 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919904 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919932 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919961 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919995 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920020 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920044 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 07 04:20:58 crc 
kubenswrapper[4689]: I0307 04:20:58.920076 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920098 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920121 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920147 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920208 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920235 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920258 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920283 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920311 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920338 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920363 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920389 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920418 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920465 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920493 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920520 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920548 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920575 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920602 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920627 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920652 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920682 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920707 4689 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920731 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920757 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920785 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920815 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920846 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920892 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920925 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920954 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920985 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921013 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921042 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921069 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921098 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921129 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921155 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921201 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 07 04:20:58 
crc kubenswrapper[4689]: I0307 04:20:58.921226 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921252 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921283 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921310 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921338 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921368 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921400 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921428 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921457 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921483 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921511 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921540 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921571 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921605 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921698 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-os-release\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921732 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-run-multus-certs\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921761 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-system-cni-dir\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921787 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-etc-openvswitch\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921812 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-system-cni-dir\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921848 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921875 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4bffa53b-77e7-4859-bd19-cd5fae877d65-hosts-file\") pod \"node-resolver-9bxdn\" (UID: \"4bffa53b-77e7-4859-bd19-cd5fae877d65\") " pod="openshift-dns/node-resolver-9bxdn" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921905 4689 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-systemd\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921928 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-etc-kubernetes\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921955 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh6hf\" (UniqueName: \"kubernetes.io/projected/d529fad8-a51c-42d5-bdf1-3abb3ec3e85a-kube-api-access-fh6hf\") pod \"node-ca-9vncl\" (UID: \"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\") " pod="openshift-image-registry/node-ca-9vncl" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921985 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.922015 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-slash\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 
04:20:58.922040 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.922066 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovn-node-metrics-cert\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.922089 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h682w\" (UniqueName: \"kubernetes.io/projected/4bffa53b-77e7-4859-bd19-cd5fae877d65-kube-api-access-h682w\") pod \"node-resolver-9bxdn\" (UID: \"4bffa53b-77e7-4859-bd19-cd5fae877d65\") " pod="openshift-dns/node-resolver-9bxdn" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.922113 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d529fad8-a51c-42d5-bdf1-3abb3ec3e85a-host\") pod \"node-ca-9vncl\" (UID: \"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\") " pod="openshift-image-registry/node-ca-9vncl" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.922146 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923343 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-var-lib-kubelet\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923428 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-cnibin\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923466 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55c70eda-8745-4c02-93db-062597d2dbc8-cni-binary-copy\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923520 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923548 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-env-overrides\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923578 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/55c70eda-8745-4c02-93db-062597d2dbc8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923617 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923656 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923690 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923715 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-openvswitch\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923746 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-cni-dir\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923773 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-socket-dir-parent\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923800 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923833 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5508b217-e634-41a8-813a-65ae39d7ea3d-cni-binary-copy\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923812 4689 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923872 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-ovn-kubernetes\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923905 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e6e9469a-474b-45c6-b3bd-638cb7a2e226-proxy-tls\") pod \"machine-config-daemon-dss5c\" (UID: \"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923951 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923985 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-var-lib-cni-bin\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924010 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6zh6\" (UniqueName: \"kubernetes.io/projected/5508b217-e634-41a8-813a-65ae39d7ea3d-kube-api-access-z6zh6\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924049 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924077 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-netns\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924101 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-var-lib-openvswitch\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924125 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-script-lib\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924190 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-daemon-config\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924222 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-kubelet\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924246 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-systemd-units\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924275 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2hpp\" (UniqueName: \"kubernetes.io/projected/ee6653df-cf05-46a7-9187-97bfc3c5b849-kube-api-access-w2hpp\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924304 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-var-lib-cni-multus\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924328 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-os-release\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924360 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e6e9469a-474b-45c6-b3bd-638cb7a2e226-mcd-auth-proxy-config\") pod \"machine-config-daemon-dss5c\" (UID: \"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 
04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924387 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-hostroot\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924431 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924460 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-log-socket\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924488 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-netd\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924520 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-cnibin\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924543 
4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d529fad8-a51c-42d5-bdf1-3abb3ec3e85a-serviceca\") pod \"node-ca-9vncl\" (UID: \"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\") " pod="openshift-image-registry/node-ca-9vncl" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924573 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgf5b\" (UniqueName: \"kubernetes.io/projected/e6e9469a-474b-45c6-b3bd-638cb7a2e226-kube-api-access-pgf5b\") pod \"machine-config-daemon-dss5c\" (UID: \"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924599 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-node-log\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924625 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-bin\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924649 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-run-netns\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924682 
4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924709 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-ovn\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924735 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e6e9469a-474b-45c6-b3bd-638cb7a2e226-rootfs\") pod \"machine-config-daemon-dss5c\" (UID: \"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924762 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-run-k8s-cni-cncf-io\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924791 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bzq8\" (UniqueName: \"kubernetes.io/projected/55c70eda-8745-4c02-93db-062597d2dbc8-kube-api-access-9bzq8\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " 
pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924823 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924852 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-config\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924891 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924917 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-conf-dir\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924944 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.925065 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.925089 4689 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.925109 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.932589 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.933327 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.944010 4689 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.946157 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.948383 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918289 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918612 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.918935 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919044 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919452 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919625 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919669 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.919707 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920018 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920029 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920257 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920444 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920548 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920812 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920810 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.920874 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921359 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921518 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921669 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921928 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.921976 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.922609 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923031 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.952926 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.923481 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924387 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924408 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924705 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.924713 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.925297 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.925366 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.925822 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.925915 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.926019 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.926938 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.927644 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.927683 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.926962 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.928287 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.928388 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.928734 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.928837 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.928888 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.928897 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.929214 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.929297 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.929501 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.929821 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.929913 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.930024 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.930067 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.930429 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.930682 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.930519 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.930982 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.931104 4689 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.931373 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.931574 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.931671 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.931967 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.932085 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.932314 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.932348 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.932461 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.932729 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.932518 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.932836 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.932910 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.932921 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.933140 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.933202 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.933261 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.933469 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.933498 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.933511 4689 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.933553 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.933604 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.933870 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.934120 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.953557 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.934438 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.934905 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.935335 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.935396 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.935640 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.935648 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.953683 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.936328 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.936371 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.936452 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.936469 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.936721 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.936793 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.936810 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.936855 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.936904 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.937075 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.937120 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.937323 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.937340 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.937440 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.937575 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.937650 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.938011 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.938021 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.938660 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.938687 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.938702 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.939146 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.939299 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.939764 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.940005 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.940015 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.940114 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.940420 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.941430 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.941911 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.941923 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.942243 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.942480 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.943088 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.944694 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.944939 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.945214 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.945788 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.946022 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.946250 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.946600 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.947571 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.947764 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.947777 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.947932 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.948015 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.948464 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.948532 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.948556 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.948922 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.949641 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.949763 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.949815 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.949814 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.949982 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.950011 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.950424 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.950970 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.951207 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.951246 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.951351 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.951637 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.951778 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.951679 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.951933 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.951866 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.952438 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.953410 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.934453 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.953798 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.953909 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:20:59.453875011 +0000 UTC m=+104.500258540 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.954574 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.954947 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.955302 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.955584 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.956042 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.956822 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.957087 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.957276 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:20:59.457253411 +0000 UTC m=+104.503636940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.957286 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.957540 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.957566 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.957588 4689 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.957648 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 04:20:59.45763199 +0000 UTC m=+104.504015519 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.957785 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:20:59.457764994 +0000 UTC m=+104.504148493 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.958125 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.958264 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.958810 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.958984 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.959860 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.959980 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.960004 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.960018 4689 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:20:58 crc kubenswrapper[4689]: E0307 04:20:58.960075 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 04:20:59.460062435 +0000 UTC m=+104.506445934 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.957659 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.963740 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.966309 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.966375 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.966679 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.966990 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.967109 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.967292 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.967440 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.967476 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.967515 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.969538 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.969725 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.970324 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.970397 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.970681 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.970356 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.970449 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.970453 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.971445 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.976795 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.978462 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.979347 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.986925 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.986983 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.987002 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.987028 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.987047 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:58Z","lastTransitionTime":"2026-03-07T04:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.988146 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.988284 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.988884 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:20:58 crc kubenswrapper[4689]: I0307 04:20:58.996308 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.002716 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.003474 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.010561 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.022711 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.025755 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.025805 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-openvswitch\") pod 
\"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.025832 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-cni-dir\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.025857 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-socket-dir-parent\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.025881 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-ovn-kubernetes\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.025904 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5508b217-e634-41a8-813a-65ae39d7ea3d-cni-binary-copy\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.025925 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6e9469a-474b-45c6-b3bd-638cb7a2e226-proxy-tls\") pod \"machine-config-daemon-dss5c\" (UID: \"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " 
pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.025957 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-var-lib-cni-bin\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.025982 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6zh6\" (UniqueName: \"kubernetes.io/projected/5508b217-e634-41a8-813a-65ae39d7ea3d-kube-api-access-z6zh6\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026003 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-kubelet\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026027 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-systemd-units\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026050 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-netns\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc 
kubenswrapper[4689]: I0307 04:20:59.026072 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-var-lib-openvswitch\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026093 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-script-lib\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026113 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-daemon-config\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026134 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2hpp\" (UniqueName: \"kubernetes.io/projected/ee6653df-cf05-46a7-9187-97bfc3c5b849-kube-api-access-w2hpp\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026154 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-var-lib-cni-multus\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026193 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-os-release\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026216 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e6e9469a-474b-45c6-b3bd-638cb7a2e226-mcd-auth-proxy-config\") pod \"machine-config-daemon-dss5c\" (UID: \"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026238 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-hostroot\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026263 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgf5b\" (UniqueName: \"kubernetes.io/projected/e6e9469a-474b-45c6-b3bd-638cb7a2e226-kube-api-access-pgf5b\") pod \"machine-config-daemon-dss5c\" (UID: \"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026302 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-log-socket\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026332 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-netd\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026354 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-cnibin\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026377 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d529fad8-a51c-42d5-bdf1-3abb3ec3e85a-serviceca\") pod \"node-ca-9vncl\" (UID: \"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\") " pod="openshift-image-registry/node-ca-9vncl" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026400 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-ovn\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026402 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-var-lib-cni-multus\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026459 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-node-log\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026421 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-node-log\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026525 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-bin\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026546 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-run-netns\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026548 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-os-release\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026570 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e6e9469a-474b-45c6-b3bd-638cb7a2e226-rootfs\") pod \"machine-config-daemon-dss5c\" (UID: 
\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026660 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-run-k8s-cni-cncf-io\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026685 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bzq8\" (UniqueName: \"kubernetes.io/projected/55c70eda-8745-4c02-93db-062597d2dbc8-kube-api-access-9bzq8\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026707 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026726 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-config\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026800 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-openvswitch\") pod \"ovnkube-node-j9bx5\" (UID: 
\"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026774 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-cni-dir\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026849 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-conf-dir\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026898 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026902 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-log-socket\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026919 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-etc-openvswitch\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 
crc kubenswrapper[4689]: I0307 04:20:59.026939 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-system-cni-dir\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026960 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-os-release\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026981 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-run-multus-certs\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026999 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-system-cni-dir\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027020 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4bffa53b-77e7-4859-bd19-cd5fae877d65-hosts-file\") pod \"node-resolver-9bxdn\" (UID: \"4bffa53b-77e7-4859-bd19-cd5fae877d65\") " pod="openshift-dns/node-resolver-9bxdn" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027045 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027060 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-systemd\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027076 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-etc-kubernetes\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027092 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh6hf\" (UniqueName: \"kubernetes.io/projected/d529fad8-a51c-42d5-bdf1-3abb3ec3e85a-kube-api-access-fh6hf\") pod \"node-ca-9vncl\" (UID: \"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\") " pod="openshift-image-registry/node-ca-9vncl" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027119 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-slash\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027136 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027154 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovn-node-metrics-cert\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027189 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h682w\" (UniqueName: \"kubernetes.io/projected/4bffa53b-77e7-4859-bd19-cd5fae877d65-kube-api-access-h682w\") pod \"node-resolver-9bxdn\" (UID: \"4bffa53b-77e7-4859-bd19-cd5fae877d65\") " pod="openshift-dns/node-resolver-9bxdn" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027206 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d529fad8-a51c-42d5-bdf1-3abb3ec3e85a-host\") pod \"node-ca-9vncl\" (UID: \"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\") " pod="openshift-image-registry/node-ca-9vncl" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027232 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-env-overrides\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027248 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-var-lib-kubelet\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027266 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-cnibin\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027282 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55c70eda-8745-4c02-93db-062597d2dbc8-cni-binary-copy\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027305 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/55c70eda-8745-4c02-93db-062597d2dbc8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027395 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027409 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027419 
4689 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027422 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e6e9469a-474b-45c6-b3bd-638cb7a2e226-rootfs\") pod \"machine-config-daemon-dss5c\" (UID: \"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027429 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027451 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-daemon-config\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027468 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027497 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-kubelet\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027531 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-bin\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027573 4689 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027578 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-etc-kubernetes\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027604 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-run-netns\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027611 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-conf-dir\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027633 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-systemd-units\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027666 4689 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-var-lib-cni-bin\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027687 4689 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027688 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-script-lib\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027759 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-run-k8s-cni-cncf-io\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027736 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-config\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027808 4689 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 
crc kubenswrapper[4689]: I0307 04:20:59.027823 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-multus-socket-dir-parent\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027843 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-ovn-kubernetes\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027875 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-cnibin\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027873 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d529fad8-a51c-42d5-bdf1-3abb3ec3e85a-host\") pod \"node-ca-9vncl\" (UID: \"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\") " pod="openshift-image-registry/node-ca-9vncl" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027890 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027914 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-slash\") pod \"ovnkube-node-j9bx5\" (UID: 
\"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027959 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-run-multus-certs\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027973 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.027989 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-etc-openvswitch\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028029 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-system-cni-dir\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.026940 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-hostroot\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc 
kubenswrapper[4689]: I0307 04:20:59.028089 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-os-release\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028119 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-ovn\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028154 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4bffa53b-77e7-4859-bd19-cd5fae877d65-hosts-file\") pod \"node-resolver-9bxdn\" (UID: \"4bffa53b-77e7-4859-bd19-cd5fae877d65\") " pod="openshift-dns/node-resolver-9bxdn" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028240 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/55c70eda-8745-4c02-93db-062597d2dbc8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028362 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-system-cni-dir\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028410 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-netd\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028587 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-cnibin\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028630 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5508b217-e634-41a8-813a-65ae39d7ea3d-host-var-lib-kubelet\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028662 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-netns\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028716 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028735 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-systemd\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028749 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-env-overrides\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.028802 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-var-lib-openvswitch\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029239 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029261 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029276 4689 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029291 4689 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029302 4689 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029315 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029329 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029344 4689 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029357 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029375 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029388 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029401 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029412 4689 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029424 4689 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029437 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029449 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029460 4689 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029473 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 
crc kubenswrapper[4689]: I0307 04:20:59.029484 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029501 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029514 4689 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029526 4689 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.029717 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5508b217-e634-41a8-813a-65ae39d7ea3d-cni-binary-copy\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030008 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030029 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 
crc kubenswrapper[4689]: I0307 04:20:59.030040 4689 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030052 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030064 4689 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030075 4689 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030087 4689 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030098 4689 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030112 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030122 4689 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030133 4689 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030146 4689 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030156 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030180 4689 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030195 4689 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030206 4689 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030216 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 
crc kubenswrapper[4689]: I0307 04:20:59.030226 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030237 4689 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030247 4689 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030259 4689 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030270 4689 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030281 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030292 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030301 4689 
reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030311 4689 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030322 4689 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030332 4689 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030341 4689 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030351 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030361 4689 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030371 4689 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030387 4689 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030397 4689 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030407 4689 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030418 4689 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030429 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030438 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030449 4689 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" 
DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030459 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030469 4689 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030478 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030488 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030496 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030507 4689 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030517 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030528 
4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030538 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030548 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030558 4689 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030569 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030578 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030591 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030601 4689 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030612 4689 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030623 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030632 4689 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030641 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030651 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030660 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030672 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030683 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030692 4689 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030701 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030710 4689 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030721 4689 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030731 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030740 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on 
node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030750 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030760 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030769 4689 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030779 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030789 4689 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030799 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030809 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030818 4689 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030829 4689 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030840 4689 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030849 4689 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030880 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030895 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030906 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030915 4689 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030926 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030935 4689 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030945 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030954 4689 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030963 4689 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030973 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030982 4689 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030992 4689 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031012 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031021 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031067 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031077 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031086 4689 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031097 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc 
kubenswrapper[4689]: I0307 04:20:59.031107 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031118 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031130 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031141 4689 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031151 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031162 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031193 4689 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 07 
04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031212 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031235 4689 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031248 4689 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031261 4689 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031275 4689 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031288 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031301 4689 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031312 4689 reconciler_common.go:293] "Volume detached 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031326 4689 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031338 4689 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031350 4689 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031364 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031383 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031395 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031407 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 
07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031419 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031423 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e6e9469a-474b-45c6-b3bd-638cb7a2e226-mcd-auth-proxy-config\") pod \"machine-config-daemon-dss5c\" (UID: \"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031432 4689 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031454 4689 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031471 4689 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031485 4689 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031496 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" 
DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031505 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031515 4689 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031525 4689 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031538 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031553 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031566 4689 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031579 4689 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031594 4689 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031607 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031619 4689 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031628 4689 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031651 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031663 4689 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031675 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031688 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on 
node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031704 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031721 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031779 4689 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031793 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031806 4689 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031820 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031835 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 
04:20:59.031849 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031861 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031875 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031889 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031903 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031917 4689 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031933 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: 
I0307 04:20:59.031952 4689 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031968 4689 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031980 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.031993 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.032007 4689 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.032020 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.030070 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55c70eda-8745-4c02-93db-062597d2dbc8-cni-binary-copy\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " 
pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.032628 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovn-node-metrics-cert\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.039138 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6e9469a-474b-45c6-b3bd-638cb7a2e226-proxy-tls\") pod \"machine-config-daemon-dss5c\" (UID: \"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.044338 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55c70eda-8745-4c02-93db-062597d2dbc8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.041144 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.048555 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d529fad8-a51c-42d5-bdf1-3abb3ec3e85a-serviceca\") pod \"node-ca-9vncl\" (UID: \"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\") " pod="openshift-image-registry/node-ca-9vncl" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.052461 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bzq8\" (UniqueName: \"kubernetes.io/projected/55c70eda-8745-4c02-93db-062597d2dbc8-kube-api-access-9bzq8\") pod \"multus-additional-cni-plugins-gjvmk\" (UID: \"55c70eda-8745-4c02-93db-062597d2dbc8\") " pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.052726 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh6hf\" (UniqueName: \"kubernetes.io/projected/d529fad8-a51c-42d5-bdf1-3abb3ec3e85a-kube-api-access-fh6hf\") pod \"node-ca-9vncl\" (UID: \"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\") " pod="openshift-image-registry/node-ca-9vncl" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.052905 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgf5b\" (UniqueName: \"kubernetes.io/projected/e6e9469a-474b-45c6-b3bd-638cb7a2e226-kube-api-access-pgf5b\") pod 
\"machine-config-daemon-dss5c\" (UID: \"e6e9469a-474b-45c6-b3bd-638cb7a2e226\") " pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.053336 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6zh6\" (UniqueName: \"kubernetes.io/projected/5508b217-e634-41a8-813a-65ae39d7ea3d-kube-api-access-z6zh6\") pod \"multus-wmhqx\" (UID: \"5508b217-e634-41a8-813a-65ae39d7ea3d\") " pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.054424 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h682w\" (UniqueName: \"kubernetes.io/projected/4bffa53b-77e7-4859-bd19-cd5fae877d65-kube-api-access-h682w\") pod \"node-resolver-9bxdn\" (UID: \"4bffa53b-77e7-4859-bd19-cd5fae877d65\") " pod="openshift-dns/node-resolver-9bxdn" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.058355 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2hpp\" (UniqueName: \"kubernetes.io/projected/ee6653df-cf05-46a7-9187-97bfc3c5b849-kube-api-access-w2hpp\") pod \"ovnkube-node-j9bx5\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.061658 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.089648 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.089705 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.089718 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.089740 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 
04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.089753 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:59Z","lastTransitionTime":"2026-03-07T04:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.152581 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.168739 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.178562 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 04:20:59 crc kubenswrapper[4689]: W0307 04:20:59.186217 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-e235164fbc672592a688a09406301c80dab38ae149ab88aaaf2f799cfed9ba65 WatchSource:0}: Error finding container e235164fbc672592a688a09406301c80dab38ae149ab88aaaf2f799cfed9ba65: Status 404 returned error can't find the container with id e235164fbc672592a688a09406301c80dab38ae149ab88aaaf2f799cfed9ba65 Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.188526 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.192660 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.192698 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.192709 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.192728 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.192740 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:59Z","lastTransitionTime":"2026-03-07T04:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.194810 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.207380 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9bxdn" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.225107 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wmhqx" Mar 07 04:20:59 crc kubenswrapper[4689]: W0307 04:20:59.244586 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6e9469a_474b_45c6_b3bd_638cb7a2e226.slice/crio-61a676c9fdaac757179db927833a6ca6a8b486bd0ea5cfd9e255e79ff1498a95 WatchSource:0}: Error finding container 61a676c9fdaac757179db927833a6ca6a8b486bd0ea5cfd9e255e79ff1498a95: Status 404 returned error can't find the container with id 61a676c9fdaac757179db927833a6ca6a8b486bd0ea5cfd9e255e79ff1498a95 Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.247937 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e235164fbc672592a688a09406301c80dab38ae149ab88aaaf2f799cfed9ba65"} Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.249961 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a518bee011f8992ae3e7e39bf8b8e035d4a730159c44ab28eb0b954286578a6e"} Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.251381 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"24b615a2fc99211a53542a18be53ab322d8446368b6de278935d4fb29fe924c4"} Mar 07 04:20:59 crc kubenswrapper[4689]: W0307 04:20:59.251753 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c70eda_8745_4c02_93db_062597d2dbc8.slice/crio-b4c2d93aeef889db52f8ca073724708426c62105181e995ecc70c065e51f862a WatchSource:0}: Error finding container 
b4c2d93aeef889db52f8ca073724708426c62105181e995ecc70c065e51f862a: Status 404 returned error can't find the container with id b4c2d93aeef889db52f8ca073724708426c62105181e995ecc70c065e51f862a Mar 07 04:20:59 crc kubenswrapper[4689]: W0307 04:20:59.256920 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bffa53b_77e7_4859_bd19_cd5fae877d65.slice/crio-3fea2ea4df4a1639334a3d5a72d152595854ad49ab0402a7e0cb14294760a233 WatchSource:0}: Error finding container 3fea2ea4df4a1639334a3d5a72d152595854ad49ab0402a7e0cb14294760a233: Status 404 returned error can't find the container with id 3fea2ea4df4a1639334a3d5a72d152595854ad49ab0402a7e0cb14294760a233 Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.285371 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.294259 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9vncl" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.296807 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.296888 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.296906 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.296940 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.296958 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:59Z","lastTransitionTime":"2026-03-07T04:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:59 crc kubenswrapper[4689]: W0307 04:20:59.315562 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5508b217_e634_41a8_813a_65ae39d7ea3d.slice/crio-6eb9e49a4aac5b9d168971e22314c14936dd1fa4219e131f6f26b7b8c87d4d23 WatchSource:0}: Error finding container 6eb9e49a4aac5b9d168971e22314c14936dd1fa4219e131f6f26b7b8c87d4d23: Status 404 returned error can't find the container with id 6eb9e49a4aac5b9d168971e22314c14936dd1fa4219e131f6f26b7b8c87d4d23 Mar 07 04:20:59 crc kubenswrapper[4689]: W0307 04:20:59.342131 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd529fad8_a51c_42d5_bdf1_3abb3ec3e85a.slice/crio-a5713ba23eaea94d44a6d68209e2ac2c330fe145ffc9d06e897bb0e1484dba9f WatchSource:0}: Error finding container a5713ba23eaea94d44a6d68209e2ac2c330fe145ffc9d06e897bb0e1484dba9f: Status 404 returned error can't find the container with id a5713ba23eaea94d44a6d68209e2ac2c330fe145ffc9d06e897bb0e1484dba9f Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.399766 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.399817 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.399832 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.399854 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.399868 4689 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:59Z","lastTransitionTime":"2026-03-07T04:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:20:59 crc kubenswrapper[4689]: W0307 04:20:59.403585 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee6653df_cf05_46a7_9187_97bfc3c5b849.slice/crio-3408db5fde5a5ed80ae1d3f2519603aa7e40a80a9d55203bf2a14ff02fcb4159 WatchSource:0}: Error finding container 3408db5fde5a5ed80ae1d3f2519603aa7e40a80a9d55203bf2a14ff02fcb4159: Status 404 returned error can't find the container with id 3408db5fde5a5ed80ae1d3f2519603aa7e40a80a9d55203bf2a14ff02fcb4159 Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.503659 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.504233 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.504247 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.504275 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.504290 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:59Z","lastTransitionTime":"2026-03-07T04:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.538799 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.538969 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.539015 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.539051 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.539091 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539305 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539329 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539346 4689 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539413 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:00.539392816 +0000 UTC m=+105.585776325 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539429 4689 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539472 4689 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539510 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:00.539484588 +0000 UTC m=+105.585868097 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539534 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-07 04:21:00.539523669 +0000 UTC m=+105.585907178 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539548 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539562 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539572 4689 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539600 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:00.539591141 +0000 UTC m=+105.585974640 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:20:59 crc kubenswrapper[4689]: E0307 04:20:59.539620 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:21:00.539610461 +0000 UTC m=+105.585993960 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.606324 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.606387 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.606406 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.606430 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.606444 4689 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:59Z","lastTransitionTime":"2026-03-07T04:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.709996 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.710034 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.710045 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.710064 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.710076 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:59Z","lastTransitionTime":"2026-03-07T04:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.813118 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.813158 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.813182 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.813210 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.813219 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:59Z","lastTransitionTime":"2026-03-07T04:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.830039 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.830818 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.831659 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.833446 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.834422 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.835028 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.835705 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.836367 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.837063 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.837635 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.838218 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.838928 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.839518 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.840064 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.840652 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.842389 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.843522 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.844219 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.844853 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.845511 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.846124 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.846774 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.847507 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.848332 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.848835 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.849638 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.850501 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.851034 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.853714 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.854757 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.855702 4689 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.855840 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.857346 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.857866 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.858442 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.859731 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.860455 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.861040 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.863651 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.864347 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.865246 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.865857 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.866821 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.867480 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.868366 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.868905 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.869817 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.870569 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.871497 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.871999 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.872525 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.873432 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.873988 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.874842 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.915816 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.915879 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:20:59 crc 
kubenswrapper[4689]: I0307 04:20:59.915893 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.915920 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:20:59 crc kubenswrapper[4689]: I0307 04:20:59.915936 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:20:59Z","lastTransitionTime":"2026-03-07T04:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.019040 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.019084 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.019100 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.019120 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.019134 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:00Z","lastTransitionTime":"2026-03-07T04:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.122754 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.122831 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.122853 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.122882 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.122900 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:00Z","lastTransitionTime":"2026-03-07T04:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.226426 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.226500 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.226524 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.226557 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.226582 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:00Z","lastTransitionTime":"2026-03-07T04:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.256585 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9bxdn" event={"ID":"4bffa53b-77e7-4859-bd19-cd5fae877d65","Type":"ContainerStarted","Data":"d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.256975 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9bxdn" event={"ID":"4bffa53b-77e7-4859-bd19-cd5fae877d65","Type":"ContainerStarted","Data":"3fea2ea4df4a1639334a3d5a72d152595854ad49ab0402a7e0cb14294760a233"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.258775 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" event={"ID":"55c70eda-8745-4c02-93db-062597d2dbc8","Type":"ContainerStarted","Data":"76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.259010 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" event={"ID":"55c70eda-8745-4c02-93db-062597d2dbc8","Type":"ContainerStarted","Data":"b4c2d93aeef889db52f8ca073724708426c62105181e995ecc70c065e51f862a"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.261967 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.262263 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83"} Mar 07 
04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.262298 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"61a676c9fdaac757179db927833a6ca6a8b486bd0ea5cfd9e255e79ff1498a95"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.264468 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.267121 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.267332 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.269662 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816" exitCode=0 Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.269756 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.270103 4689 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"3408db5fde5a5ed80ae1d3f2519603aa7e40a80a9d55203bf2a14ff02fcb4159"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.273687 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9vncl" event={"ID":"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a","Type":"ContainerStarted","Data":"c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.273748 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9vncl" event={"ID":"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a","Type":"ContainerStarted","Data":"a5713ba23eaea94d44a6d68209e2ac2c330fe145ffc9d06e897bb0e1484dba9f"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.280736 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wmhqx" event={"ID":"5508b217-e634-41a8-813a-65ae39d7ea3d","Type":"ContainerStarted","Data":"733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.280942 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wmhqx" event={"ID":"5508b217-e634-41a8-813a-65ae39d7ea3d","Type":"ContainerStarted","Data":"6eb9e49a4aac5b9d168971e22314c14936dd1fa4219e131f6f26b7b8c87d4d23"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.280782 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.295565 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.312810 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.325083 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.328983 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.329022 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.329032 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.329050 4689 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.329061 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:00Z","lastTransitionTime":"2026-03-07T04:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.342407 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.361108 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.379053 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.410314 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.421511 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.433325 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.433498 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.433537 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.433676 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.433710 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:00Z","lastTransitionTime":"2026-03-07T04:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.439076 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.458664 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.471568 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.484624 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.496749 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.513396 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.536877 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.536927 4689 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.536940 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.536960 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.536973 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:00Z","lastTransitionTime":"2026-03-07T04:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.539461 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.551212 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.551315 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.551367 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551387 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:21:02.551364519 +0000 UTC m=+107.597748008 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.551414 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.551454 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:00 
crc kubenswrapper[4689]: E0307 04:21:00.551499 4689 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551538 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551542 4689 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551557 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551567 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:02.551553704 +0000 UTC m=+107.597937193 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551574 4689 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551590 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:02.551581146 +0000 UTC m=+107.597964635 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551616 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:02.551605346 +0000 UTC m=+107.597988835 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551745 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551800 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551823 4689 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.551918 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:02.551890044 +0000 UTC m=+107.598273563 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.555461 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.568816 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.587308 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.600707 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b892
25a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.615829 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.627509 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.639533 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.639632 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.639652 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.639681 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.639700 4689 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:00Z","lastTransitionTime":"2026-03-07T04:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.643118 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.654813 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.743315 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.743386 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.743407 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.743438 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.743457 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:00Z","lastTransitionTime":"2026-03-07T04:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.825429 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.825474 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.825639 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.825928 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.825968 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:00 crc kubenswrapper[4689]: E0307 04:21:00.826514 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.846731 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.846778 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.846790 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.846809 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.846820 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:00Z","lastTransitionTime":"2026-03-07T04:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.951304 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.951402 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.951427 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.951462 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:00 crc kubenswrapper[4689]: I0307 04:21:00.951495 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:00Z","lastTransitionTime":"2026-03-07T04:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.055063 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.055150 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.055212 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.055249 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.055295 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:01Z","lastTransitionTime":"2026-03-07T04:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.159044 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.159111 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.159130 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.159156 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.159202 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:01Z","lastTransitionTime":"2026-03-07T04:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.261968 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.262031 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.262045 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.262071 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.262088 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:01Z","lastTransitionTime":"2026-03-07T04:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.289957 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.290041 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.290069 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.293960 4689 generic.go:334] "Generic (PLEG): container finished" podID="55c70eda-8745-4c02-93db-062597d2dbc8" containerID="76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61" exitCode=0 Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.294061 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" event={"ID":"55c70eda-8745-4c02-93db-062597d2dbc8","Type":"ContainerDied","Data":"76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.309493 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.325540 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.336600 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.348736 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.360507 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.365566 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.365604 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.365616 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.365634 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.365647 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:01Z","lastTransitionTime":"2026-03-07T04:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.376961 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:01Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.399830 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:01Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.417687 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:01Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.432674 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:01Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.453190 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:01Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.467921 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:01Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.468126 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.468214 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.468225 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.468270 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.468283 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:01Z","lastTransitionTime":"2026-03-07T04:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.480142 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be
5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:01Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.572734 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.572767 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.572782 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.572802 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.572816 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:01Z","lastTransitionTime":"2026-03-07T04:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.682418 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.682529 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.682561 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.682600 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.682628 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:01Z","lastTransitionTime":"2026-03-07T04:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.785833 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.785875 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.785889 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.785910 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.785924 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:01Z","lastTransitionTime":"2026-03-07T04:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.889222 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.889275 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.889289 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.889311 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.889325 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:01Z","lastTransitionTime":"2026-03-07T04:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.993047 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.993101 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.993113 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.993143 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:01 crc kubenswrapper[4689]: I0307 04:21:01.993157 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:01Z","lastTransitionTime":"2026-03-07T04:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.096410 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.096469 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.096499 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.096517 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.096531 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:02Z","lastTransitionTime":"2026-03-07T04:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.200208 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.200288 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.200307 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.200340 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.200363 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:02Z","lastTransitionTime":"2026-03-07T04:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.303028 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.303066 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.303075 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.303094 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.303105 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:02Z","lastTransitionTime":"2026-03-07T04:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.303640 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" event={"ID":"55c70eda-8745-4c02-93db-062597d2dbc8","Type":"ContainerStarted","Data":"389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.309125 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.309162 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.309189 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.328401 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.358505 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.384597 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.400904 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.406438 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.406484 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.406497 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.406524 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.406537 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:02Z","lastTransitionTime":"2026-03-07T04:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.425697 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.444564 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.467582 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.483309 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.500364 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.511838 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.512248 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.512265 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.512288 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.512301 4689 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:02Z","lastTransitionTime":"2026-03-07T04:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.516256 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.533196 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.555097 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:02Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.574500 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.574638 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.574675 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:02 crc 
kubenswrapper[4689]: I0307 04:21:02.574707 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.574747 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.574923 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.574946 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.574962 4689 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.575015 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-07 04:21:06.57499501 +0000 UTC m=+111.621378509 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.575454 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:21:06.575434511 +0000 UTC m=+111.621818020 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.575497 4689 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.575497 4689 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.575547 4689 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:06.575536274 +0000 UTC m=+111.621919773 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.575631 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:06.575604006 +0000 UTC m=+111.621987505 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.575642 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.575675 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.575727 4689 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.575799 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:06.57578053 +0000 UTC m=+111.622164259 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.615838 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.615902 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.615921 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.615953 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.615977 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:02Z","lastTransitionTime":"2026-03-07T04:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.718900 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.718947 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.718956 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.718974 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.718985 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:02Z","lastTransitionTime":"2026-03-07T04:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.822392 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.822447 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.822459 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.822485 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.822501 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:02Z","lastTransitionTime":"2026-03-07T04:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.824784 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.824784 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.824917 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.824947 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.825288 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:02 crc kubenswrapper[4689]: E0307 04:21:02.825331 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.925619 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.925671 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.925680 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.925702 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:02 crc kubenswrapper[4689]: I0307 04:21:02.925716 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:02Z","lastTransitionTime":"2026-03-07T04:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.028718 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.028796 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.028805 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.028832 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.028843 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.131048 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.131100 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.131111 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.131131 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.131147 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.234449 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.234521 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.234550 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.234581 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.234604 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.316327 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.320107 4689 generic.go:334] "Generic (PLEG): container finished" podID="55c70eda-8745-4c02-93db-062597d2dbc8" containerID="389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b" exitCode=0 Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.320219 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" event={"ID":"55c70eda-8745-4c02-93db-062597d2dbc8","Type":"ContainerDied","Data":"389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.338107 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.338285 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.338330 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.338345 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.338366 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.338424 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.356392 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be
5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.379336 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.403092 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.424701 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.441320 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.442347 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.442414 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.442443 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.442470 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.442484 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.470107 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.485869 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.507980 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.527294 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.545211 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.545265 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.545277 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.545295 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.545340 4689 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.547472 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.567129 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.586002 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.606489 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.627511 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.643253 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.649159 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.649207 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.649217 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.649236 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.649247 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.660609 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.680779 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.699581 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.714507 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.728456 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.751873 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.751955 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.751969 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.751994 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.752010 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.758732 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.758774 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.758786 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.758801 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.758811 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.763885 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.784399 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: E0307 04:21:03.796709 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.801736 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.801773 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.801782 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.801815 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.801826 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: E0307 04:21:03.822705 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.829702 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.829739 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.829750 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.829766 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.829776 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.839286 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: E0307 04:21:03.853005 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.858834 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.858883 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.858896 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.858913 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.858923 4689 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: E0307 04:21:03.879991 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.885925 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.885953 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.885962 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.885975 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.885984 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:03 crc kubenswrapper[4689]: E0307 04:21:03.898841 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:03Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:03 crc kubenswrapper[4689]: E0307 04:21:03.898967 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.901583 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.901612 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.901623 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.901640 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:03 crc kubenswrapper[4689]: I0307 04:21:03.901651 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:03Z","lastTransitionTime":"2026-03-07T04:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.004066 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.004434 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.004443 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.004458 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.004469 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:04Z","lastTransitionTime":"2026-03-07T04:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.108010 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.108059 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.108070 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.108085 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.108097 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:04Z","lastTransitionTime":"2026-03-07T04:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.210812 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.210850 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.210860 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.210876 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.210887 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:04Z","lastTransitionTime":"2026-03-07T04:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.315485 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.315520 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.315530 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.315544 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.315554 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:04Z","lastTransitionTime":"2026-03-07T04:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.320615 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf"] Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.321207 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.324060 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.330048 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.338965 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.342354 4689 generic.go:334] "Generic (PLEG): container finished" podID="55c70eda-8745-4c02-93db-062597d2dbc8" containerID="ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499" exitCode=0 Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.342481 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" event={"ID":"55c70eda-8745-4c02-93db-062597d2dbc8","Type":"ContainerDied","Data":"ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.350015 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.362960 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.380192 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.393851 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.409023 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.421182 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.421223 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.421236 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.421256 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.421280 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:04Z","lastTransitionTime":"2026-03-07T04:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.429128 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.444235 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.460761 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.477411 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.493043 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.496695 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11a8fe40-7781-4819-bb57-f52325e9fcc8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.496750 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11a8fe40-7781-4819-bb57-f52325e9fcc8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.496803 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11a8fe40-7781-4819-bb57-f52325e9fcc8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.496843 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdpsx\" (UniqueName: \"kubernetes.io/projected/11a8fe40-7781-4819-bb57-f52325e9fcc8-kube-api-access-hdpsx\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.509305 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.524086 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:04 crc 
kubenswrapper[4689]: I0307 04:21:04.524131 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.524140 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.524157 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.524200 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:04Z","lastTransitionTime":"2026-03-07T04:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.540134 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.555227 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.567141 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.579774 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.596033 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.597268 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdpsx\" (UniqueName: \"kubernetes.io/projected/11a8fe40-7781-4819-bb57-f52325e9fcc8-kube-api-access-hdpsx\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.597338 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11a8fe40-7781-4819-bb57-f52325e9fcc8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.597356 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11a8fe40-7781-4819-bb57-f52325e9fcc8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.597388 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/11a8fe40-7781-4819-bb57-f52325e9fcc8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.598478 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11a8fe40-7781-4819-bb57-f52325e9fcc8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.598806 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11a8fe40-7781-4819-bb57-f52325e9fcc8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.607862 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.609224 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11a8fe40-7781-4819-bb57-f52325e9fcc8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.622613 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdpsx\" (UniqueName: \"kubernetes.io/projected/11a8fe40-7781-4819-bb57-f52325e9fcc8-kube-api-access-hdpsx\") pod \"ovnkube-control-plane-749d76644c-mxsgf\" (UID: \"11a8fe40-7781-4819-bb57-f52325e9fcc8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.624503 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.627964 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.628023 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.628041 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.628068 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.628086 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:04Z","lastTransitionTime":"2026-03-07T04:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.638909 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.641928 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.651237 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c1
8e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: W0307 04:21:04.661487 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11a8fe40_7781_4819_bb57_f52325e9fcc8.slice/crio-d8a20a729fa84f9fc972d9a87760ec4abb840fe9b81caeb5ac89b4efec63c19c WatchSource:0}: Error finding container d8a20a729fa84f9fc972d9a87760ec4abb840fe9b81caeb5ac89b4efec63c19c: Status 404 returned error can't find the container with id d8a20a729fa84f9fc972d9a87760ec4abb840fe9b81caeb5ac89b4efec63c19c Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.664728 4689 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.678500 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.692200 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.721607 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.736130 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.736202 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.736223 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.736250 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.736266 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:04Z","lastTransitionTime":"2026-03-07T04:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.739142 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.754961 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:04Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.825504 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:04 crc kubenswrapper[4689]: E0307 04:21:04.825637 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.826018 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.826056 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:04 crc kubenswrapper[4689]: E0307 04:21:04.826104 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:04 crc kubenswrapper[4689]: E0307 04:21:04.826146 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.838112 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.838150 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.838159 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.838195 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.838210 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:04Z","lastTransitionTime":"2026-03-07T04:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.840695 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.841134 4689 scope.go:117] "RemoveContainer" containerID="504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.940571 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.940622 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.940640 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.940659 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:04 crc kubenswrapper[4689]: I0307 04:21:04.940673 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:04Z","lastTransitionTime":"2026-03-07T04:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.043421 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.043473 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.043483 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.043503 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.043517 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:05Z","lastTransitionTime":"2026-03-07T04:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.056940 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-95vzv"] Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.057747 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:05 crc kubenswrapper[4689]: E0307 04:21:05.057859 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.075328 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.089116 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.112299 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.130024 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.148457 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.148521 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.148536 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.148561 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.148575 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:05Z","lastTransitionTime":"2026-03-07T04:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.152875 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.179668 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.193826 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc 
kubenswrapper[4689]: I0307 04:21:05.203527 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.203724 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trdmb\" (UniqueName: \"kubernetes.io/projected/16e0e2e8-673a-446e-b377-f30ffd8edd1f-kube-api-access-trdmb\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.217565 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.237745 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.251219 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.251281 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.251291 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.251312 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.251323 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:05Z","lastTransitionTime":"2026-03-07T04:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.256633 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.275380 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.293826 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.305113 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trdmb\" (UniqueName: \"kubernetes.io/projected/16e0e2e8-673a-446e-b377-f30ffd8edd1f-kube-api-access-trdmb\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.305237 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:05 crc kubenswrapper[4689]: E0307 04:21:05.305432 4689 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:05 crc kubenswrapper[4689]: E0307 04:21:05.305513 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs podName:16e0e2e8-673a-446e-b377-f30ffd8edd1f nodeName:}" failed. No retries permitted until 2026-03-07 04:21:05.805488801 +0000 UTC m=+110.851872330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs") pod "network-metrics-daemon-95vzv" (UID: "16e0e2e8-673a-446e-b377-f30ffd8edd1f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.310539 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f
532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.337827 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trdmb\" (UniqueName: \"kubernetes.io/projected/16e0e2e8-673a-446e-b377-f30ffd8edd1f-kube-api-access-trdmb\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.339095 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.347648 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" 
event={"ID":"11a8fe40-7781-4819-bb57-f52325e9fcc8","Type":"ContainerStarted","Data":"d8a20a729fa84f9fc972d9a87760ec4abb840fe9b81caeb5ac89b4efec63c19c"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.351263 4689 generic.go:334] "Generic (PLEG): container finished" podID="55c70eda-8745-4c02-93db-062597d2dbc8" containerID="79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a" exitCode=0 Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.351310 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" event={"ID":"55c70eda-8745-4c02-93db-062597d2dbc8","Type":"ContainerDied","Data":"79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.355299 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.355352 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.355370 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.355393 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.355411 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:05Z","lastTransitionTime":"2026-03-07T04:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.380558 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.405406 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.427505 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.447214 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.458419 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.458488 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.458509 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:05 crc 
kubenswrapper[4689]: I0307 04:21:05.458535 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.458554 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:05Z","lastTransitionTime":"2026-03-07T04:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.471003 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b1
08211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.490218 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed
0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.518222 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.549377 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.562074 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.562117 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.562128 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.562151 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.562183 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:05Z","lastTransitionTime":"2026-03-07T04:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.569344 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.588761 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.607470 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc 
kubenswrapper[4689]: I0307 04:21:05.631041 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.654197 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.664988 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.665048 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.665066 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:05 crc 
kubenswrapper[4689]: I0307 04:21:05.665090 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.665103 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:05Z","lastTransitionTime":"2026-03-07T04:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.680693 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.708409 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.732197 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.767515 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.767567 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.767585 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.767611 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.767632 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:05Z","lastTransitionTime":"2026-03-07T04:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.810891 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:05 crc kubenswrapper[4689]: E0307 04:21:05.811102 4689 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:05 crc kubenswrapper[4689]: E0307 04:21:05.811211 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs podName:16e0e2e8-673a-446e-b377-f30ffd8edd1f nodeName:}" failed. No retries permitted until 2026-03-07 04:21:06.811157295 +0000 UTC m=+111.857540814 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs") pod "network-metrics-daemon-95vzv" (UID: "16e0e2e8-673a-446e-b377-f30ffd8edd1f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.844860 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.857791 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.870114 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.870208 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.870227 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.870254 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.870275 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:05Z","lastTransitionTime":"2026-03-07T04:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.872996 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be
5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.890983 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.902087 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.923503 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.938756 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.966798 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f
4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.972665 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.972692 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.972699 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.972712 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.972721 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:05Z","lastTransitionTime":"2026-03-07T04:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:05 crc kubenswrapper[4689]: I0307 04:21:05.986969 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.006513 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.020764 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.033394 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc 
kubenswrapper[4689]: I0307 04:21:06.046477 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.058042 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.068421 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.075583 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.075649 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.075673 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.075700 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.075719 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:06Z","lastTransitionTime":"2026-03-07T04:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.179845 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.179915 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.179934 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.179966 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.179981 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:06Z","lastTransitionTime":"2026-03-07T04:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.283039 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.283122 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.283156 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.283264 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.283299 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:06Z","lastTransitionTime":"2026-03-07T04:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.359521 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" event={"ID":"11a8fe40-7781-4819-bb57-f52325e9fcc8","Type":"ContainerStarted","Data":"882e803d8455cb1f09a5ddcf358ff808c5f8bd2ee61ecb6f9be6c0bf9233f8ec"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.359581 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" event={"ID":"11a8fe40-7781-4819-bb57-f52325e9fcc8","Type":"ContainerStarted","Data":"58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.363454 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" event={"ID":"55c70eda-8745-4c02-93db-062597d2dbc8","Type":"ContainerStarted","Data":"0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.368065 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.369654 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.370914 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.382378 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.386705 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.386781 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.386794 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.386814 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.386828 4689 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:06Z","lastTransitionTime":"2026-03-07T04:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.395307 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.407625 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.420346 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc 
kubenswrapper[4689]: I0307 04:21:06.447576 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.464327 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 
04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.489440 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.489499 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.489514 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.489539 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.489561 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:06Z","lastTransitionTime":"2026-03-07T04:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.526558 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z 
is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.567870 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.586642 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.592156 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.592215 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.592226 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.592244 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.592257 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:06Z","lastTransitionTime":"2026-03-07T04:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.601435 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9
f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8bd2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.613364 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.625556 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.627864 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.628000 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.628040 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.628069 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628108 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:21:14.628070284 +0000 UTC m=+119.674453773 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628141 4689 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.628185 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628203 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:14.628188877 +0000 UTC m=+119.674572366 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628250 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628290 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628294 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628319 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628334 4689 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628365 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-07 04:21:14.628355122 +0000 UTC m=+119.674738611 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628308 4689 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628398 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:14.628392453 +0000 UTC m=+119.674775942 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628250 4689 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.628429 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:14.628424204 +0000 UTC m=+119.674807693 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.639744 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.651729 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.668079 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f
4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.688204 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.695256 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.695299 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.695310 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.695327 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.695339 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:06Z","lastTransitionTime":"2026-03-07T04:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.706372 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.722496 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.736280 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.746421 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.760259 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.771277 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.786250 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.797699 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:06 crc 
kubenswrapper[4689]: I0307 04:21:06.797722 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.797730 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.797745 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.797754 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:06Z","lastTransitionTime":"2026-03-07T04:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.809538 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.825131 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.825193 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.825137 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.825137 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.825281 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.825367 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.825429 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.825498 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.825833 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8bd2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.830191 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.830315 4689 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:06 crc kubenswrapper[4689]: E0307 04:21:06.830362 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs podName:16e0e2e8-673a-446e-b377-f30ffd8edd1f nodeName:}" failed. No retries permitted until 2026-03-07 04:21:08.83034744 +0000 UTC m=+113.876730929 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs") pod "network-metrics-daemon-95vzv" (UID: "16e0e2e8-673a-446e-b377-f30ffd8edd1f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.845722 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.860963 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.874717 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.883782 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.900716 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.900749 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.900759 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.900773 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.900782 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:06Z","lastTransitionTime":"2026-03-07T04:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:06 crc kubenswrapper[4689]: I0307 04:21:06.900891 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.004154 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.004226 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.004235 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.004255 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.004265 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:07Z","lastTransitionTime":"2026-03-07T04:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.107677 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.107756 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.107782 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.107819 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.107847 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:07Z","lastTransitionTime":"2026-03-07T04:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.211697 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.211761 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.211779 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.211806 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.211824 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:07Z","lastTransitionTime":"2026-03-07T04:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.315380 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.315446 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.315464 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.315487 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.315505 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:07Z","lastTransitionTime":"2026-03-07T04:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.380057 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.380650 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.382270 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.382322 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.389980 4689 generic.go:334] "Generic (PLEG): container finished" podID="55c70eda-8745-4c02-93db-062597d2dbc8" containerID="0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f" exitCode=0 Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.391494 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" event={"ID":"55c70eda-8745-4c02-93db-062597d2dbc8","Type":"ContainerDied","Data":"0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.405596 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.429135 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 
04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.431501 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.431553 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.431572 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.431601 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.431621 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:07Z","lastTransitionTime":"2026-03-07T04:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.443089 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.445878 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.472979 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.491229 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.510814 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.525963 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.534493 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.534548 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.534566 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.534591 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.534608 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:07Z","lastTransitionTime":"2026-03-07T04:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.549952 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bz
q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.562560 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.580120 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.596502 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.613319 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.635481 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.638574 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.638633 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.638652 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.638681 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.638702 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:07Z","lastTransitionTime":"2026-03-07T04:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.654710 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc 
kubenswrapper[4689]: I0307 04:21:07.674741 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.692663 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.707491 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.726162 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.743007 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.743070 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.743087 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.743113 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.743130 4689 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:07Z","lastTransitionTime":"2026-03-07T04:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.747619 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.767203 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.791233 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.807763 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4
f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.829289 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.846137 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.846645 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.846738 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.846853 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.846942 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:07Z","lastTransitionTime":"2026-03-07T04:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.849812 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.868096 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.891754 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.908072 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.926909 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.941895 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.950394 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.950425 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.950436 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:07 crc 
kubenswrapper[4689]: I0307 04:21:07.950454 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.950467 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:07Z","lastTransitionTime":"2026-03-07T04:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.955684 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:07 crc kubenswrapper[4689]: I0307 04:21:07.979423 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:07Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.053115 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.053160 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.053215 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.053246 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.053269 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:08Z","lastTransitionTime":"2026-03-07T04:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.156493 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.156552 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.156575 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.156601 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.156622 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:08Z","lastTransitionTime":"2026-03-07T04:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.260136 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.260243 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.260261 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.260284 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.260303 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:08Z","lastTransitionTime":"2026-03-07T04:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.363847 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.363892 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.363909 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.363932 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.363949 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:08Z","lastTransitionTime":"2026-03-07T04:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.399145 4689 generic.go:334] "Generic (PLEG): container finished" podID="55c70eda-8745-4c02-93db-062597d2dbc8" containerID="ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22" exitCode=0 Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.401005 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" event={"ID":"55c70eda-8745-4c02-93db-062597d2dbc8","Type":"ContainerDied","Data":"ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22"} Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.440445 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.468565 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.471637 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.471669 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.471680 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 
04:21:08.471702 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.471716 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:08Z","lastTransitionTime":"2026-03-07T04:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.487906 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.518550 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.547297 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.575712 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.575743 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.575753 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.575768 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.575780 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:08Z","lastTransitionTime":"2026-03-07T04:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.580037 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.598214 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.609049 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.621620 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.636702 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.650417 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.661743 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.675538 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.678675 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:08 crc 
kubenswrapper[4689]: I0307 04:21:08.678707 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.678719 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.678738 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.678751 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:08Z","lastTransitionTime":"2026-03-07T04:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.693636 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.704433 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:08Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.786869 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.786943 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.786960 4689 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.787348 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.787392 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:08Z","lastTransitionTime":"2026-03-07T04:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.825240 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.825325 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.825441 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:08 crc kubenswrapper[4689]: E0307 04:21:08.825429 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:08 crc kubenswrapper[4689]: E0307 04:21:08.825615 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.825250 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:08 crc kubenswrapper[4689]: E0307 04:21:08.825726 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:08 crc kubenswrapper[4689]: E0307 04:21:08.825806 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.871669 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:08 crc kubenswrapper[4689]: E0307 04:21:08.871835 4689 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:08 crc kubenswrapper[4689]: E0307 04:21:08.871899 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs podName:16e0e2e8-673a-446e-b377-f30ffd8edd1f nodeName:}" failed. No retries permitted until 2026-03-07 04:21:12.871881304 +0000 UTC m=+117.918264793 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs") pod "network-metrics-daemon-95vzv" (UID: "16e0e2e8-673a-446e-b377-f30ffd8edd1f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.889562 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.889592 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.889600 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.889613 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.889623 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:08Z","lastTransitionTime":"2026-03-07T04:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.993140 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.993223 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.993236 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.993256 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:08 crc kubenswrapper[4689]: I0307 04:21:08.993267 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:08Z","lastTransitionTime":"2026-03-07T04:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.096907 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.096962 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.096974 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.096996 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.097009 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:09Z","lastTransitionTime":"2026-03-07T04:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.200095 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.200144 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.200153 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.200183 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.200192 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:09Z","lastTransitionTime":"2026-03-07T04:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.304770 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.304822 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.304840 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.304860 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.304874 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:09Z","lastTransitionTime":"2026-03-07T04:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.405709 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/0.log" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.406835 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.406868 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.406885 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.406903 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.406917 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:09Z","lastTransitionTime":"2026-03-07T04:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.410330 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a" exitCode=1 Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.410420 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a"} Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.411496 4689 scope.go:117] "RemoveContainer" containerID="84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.415856 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" event={"ID":"55c70eda-8745-4c02-93db-062597d2dbc8","Type":"ContainerStarted","Data":"ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd"} Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.433577 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.459654 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.481699 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.499425 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.511580 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.511661 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.511684 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.511709 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.511748 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:09Z","lastTransitionTime":"2026-03-07T04:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.521839 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc 
kubenswrapper[4689]: I0307 04:21:09.539875 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.554159 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.574252 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.600337 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Pod event handler 6\\\\nI0307 04:21:09.230584 6511 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 04:21:09.230594 6511 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0307 
04:21:09.230593 6511 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:09.230618 6511 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 04:21:09.230634 6511 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:09.230639 6511 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0307 04:21:09.230785 6511 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:09.230859 6511 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:09.231569 6511 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0307 04:21:09.231604 6511 factory.go:656] Stopping watch factory\\\\nI0307 04:21:09.231625 6511 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.615575 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.615634 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.615649 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.615670 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.615684 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:09Z","lastTransitionTime":"2026-03-07T04:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.616374 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8bd2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 
04:21:09.635949 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 
04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.654406 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.671620 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.691527 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b1
08211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.702492 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.715591 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8bd2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.719374 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.719416 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.719430 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.719453 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.719470 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:09Z","lastTransitionTime":"2026-03-07T04:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.728102 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.743512 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.757788 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.781756 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.795756 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.808079 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.820083 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.821968 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.822011 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.822022 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.822041 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.822051 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:09Z","lastTransitionTime":"2026-03-07T04:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.832043 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc 
kubenswrapper[4689]: I0307 04:21:09.844507 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.855678 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.867863 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.886563 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Pod event handler 6\\\\nI0307 04:21:09.230584 6511 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 04:21:09.230594 6511 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0307 
04:21:09.230593 6511 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:09.230618 6511 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 04:21:09.230634 6511 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:09.230639 6511 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0307 04:21:09.230785 6511 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:09.230859 6511 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:09.231569 6511 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0307 04:21:09.231604 6511 factory.go:656] Stopping watch factory\\\\nI0307 04:21:09.231625 6511 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.901269 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.916289 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:09Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.925081 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.925136 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.925153 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:09 crc 
kubenswrapper[4689]: I0307 04:21:09.925195 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:09 crc kubenswrapper[4689]: I0307 04:21:09.925214 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:09Z","lastTransitionTime":"2026-03-07T04:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.027880 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.027927 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.027937 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.027956 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.027968 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:10Z","lastTransitionTime":"2026-03-07T04:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.130488 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.130522 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.130532 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.130546 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.130556 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:10Z","lastTransitionTime":"2026-03-07T04:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.233441 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.233492 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.233502 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.233523 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.233534 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:10Z","lastTransitionTime":"2026-03-07T04:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.335658 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.335704 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.335718 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.335737 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.335750 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:10Z","lastTransitionTime":"2026-03-07T04:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.423356 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/0.log" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.427625 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd"} Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.428338 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.438712 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.438762 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.438781 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.438804 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.438820 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:10Z","lastTransitionTime":"2026-03-07T04:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.457788 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.473548 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.494014 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.516208 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.528936 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.542141 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.542228 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.542247 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.542274 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.542295 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:10Z","lastTransitionTime":"2026-03-07T04:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.546867 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.568156 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.583908 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.602709 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.617143 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.632343 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.644731 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.644777 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.644790 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.644807 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.644820 4689 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:10Z","lastTransitionTime":"2026-03-07T04:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.656750 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.670686 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.689880 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Pod event handler 6\\\\nI0307 04:21:09.230584 6511 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 04:21:09.230594 6511 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0307 04:21:09.230593 6511 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:09.230618 6511 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 04:21:09.230634 6511 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:09.230639 6511 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0307 04:21:09.230785 6511 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:09.230859 6511 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:09.231569 6511 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0307 04:21:09.231604 6511 factory.go:656] Stopping watch factory\\\\nI0307 04:21:09.231625 6511 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.702354 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:10Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.747851 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.747906 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.747918 4689 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.747938 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.747951 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:10Z","lastTransitionTime":"2026-03-07T04:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.825865 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.825939 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.825980 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:10 crc kubenswrapper[4689]: E0307 04:21:10.826099 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.826485 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:10 crc kubenswrapper[4689]: E0307 04:21:10.826552 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:10 crc kubenswrapper[4689]: E0307 04:21:10.826614 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:10 crc kubenswrapper[4689]: E0307 04:21:10.826668 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.854120 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.854234 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.854256 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.854284 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.854315 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:10Z","lastTransitionTime":"2026-03-07T04:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.958130 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.958257 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.958276 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.958304 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:10 crc kubenswrapper[4689]: I0307 04:21:10.958319 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:10Z","lastTransitionTime":"2026-03-07T04:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.061053 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.061106 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.061120 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.061140 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.061155 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:11Z","lastTransitionTime":"2026-03-07T04:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.164822 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.164901 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.164918 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.164941 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.164956 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:11Z","lastTransitionTime":"2026-03-07T04:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.269735 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.269805 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.269822 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.269850 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.269873 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:11Z","lastTransitionTime":"2026-03-07T04:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.376604 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.376676 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.376697 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.376724 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.376743 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:11Z","lastTransitionTime":"2026-03-07T04:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.435487 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/1.log" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.436975 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/0.log" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.440651 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd" exitCode=1 Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.440726 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd"} Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.440788 4689 scope.go:117] "RemoveContainer" containerID="84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.441689 4689 scope.go:117] "RemoveContainer" containerID="2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd" Mar 07 04:21:11 crc kubenswrapper[4689]: E0307 04:21:11.441920 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.464755 4689 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.480858 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.481364 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.481854 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.482095 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.482367 4689 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:11Z","lastTransitionTime":"2026-03-07T04:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.483698 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.504929 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.531365 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84afee0b6ac496aada2ea9624fb2b695325218adad97d82316e795da43f23a0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Pod event handler 6\\\\nI0307 04:21:09.230584 6511 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 04:21:09.230594 6511 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0307 04:21:09.230593 6511 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:09.230618 6511 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 04:21:09.230634 6511 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:09.230639 6511 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0307 04:21:09.230785 6511 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:09.230859 6511 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:09.231569 6511 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0307 04:21:09.231604 6511 factory.go:656] Stopping watch factory\\\\nI0307 04:21:09.231625 6511 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:10Z\\\",\\\"message\\\":\\\"0.534759 6705 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534567 6705 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534887 6705 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.535058 6705 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.534731 6705 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.535465 6705 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:10.536014 6705 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.536330 6705 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 04:21:10.536401 6705 factory.go:656] Stopping watch factory\\\\nI0307 04:21:10.536421 6705 ovnkube.go:599] Stopped ovnkube\\\\nI0307 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-
netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.547840 4689 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8bd2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.566689 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.584059 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.586311 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.586357 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:11 crc 
kubenswrapper[4689]: I0307 04:21:11.586373 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.586399 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.586416 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:11Z","lastTransitionTime":"2026-03-07T04:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.601316 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.618254 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.631667 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.646343 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.665088 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.680358 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.689932 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.689999 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.690015 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.690047 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.690064 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:11Z","lastTransitionTime":"2026-03-07T04:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.697891 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.710989 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:11Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.794249 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.794325 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.794337 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.794361 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.794556 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:11Z","lastTransitionTime":"2026-03-07T04:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.898013 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.898095 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.898115 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.898147 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:11 crc kubenswrapper[4689]: I0307 04:21:11.898207 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:11Z","lastTransitionTime":"2026-03-07T04:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.002020 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.002117 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.002142 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.002223 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.002253 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:12Z","lastTransitionTime":"2026-03-07T04:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.104815 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.104880 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.104899 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.105064 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.105095 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:12Z","lastTransitionTime":"2026-03-07T04:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.209575 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.209653 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.209670 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.209702 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.209722 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:12Z","lastTransitionTime":"2026-03-07T04:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.314310 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.314377 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.314398 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.314426 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.314443 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:12Z","lastTransitionTime":"2026-03-07T04:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.418061 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.418116 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.418129 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.418149 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.418161 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:12Z","lastTransitionTime":"2026-03-07T04:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.451023 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/1.log" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.455031 4689 scope.go:117] "RemoveContainer" containerID="2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd" Mar 07 04:21:12 crc kubenswrapper[4689]: E0307 04:21:12.455214 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.473997 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.503361 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.522450 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.522489 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.522500 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.522522 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.522535 4689 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:12Z","lastTransitionTime":"2026-03-07T04:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.523982 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.544400 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.578349 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.607250 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.625843 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.625904 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.625920 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.625975 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.625989 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:12Z","lastTransitionTime":"2026-03-07T04:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.634484 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.653535 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.673653 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.688148 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.701341 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.716137 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.728493 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.729320 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.729356 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.729368 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:12 crc 
kubenswrapper[4689]: I0307 04:21:12.729390 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.729403 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:12Z","lastTransitionTime":"2026-03-07T04:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.742508 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.761604 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:10Z\\\",\\\"message\\\":\\\"0.534759 6705 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534567 6705 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534887 6705 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy 
(0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.535058 6705 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.534731 6705 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.535465 6705 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:10.536014 6705 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.536330 6705 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 04:21:10.536401 6705 factory.go:656] Stopping watch factory\\\\nI0307 04:21:10.536421 6705 ovnkube.go:599] Stopped ovnkube\\\\nI0307 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addd
a481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:12Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.825973 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.826043 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.825989 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.826153 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:12 crc kubenswrapper[4689]: E0307 04:21:12.826279 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:12 crc kubenswrapper[4689]: E0307 04:21:12.826496 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:12 crc kubenswrapper[4689]: E0307 04:21:12.826698 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:12 crc kubenswrapper[4689]: E0307 04:21:12.826973 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.832328 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.832370 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.832383 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.832403 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.832426 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:12Z","lastTransitionTime":"2026-03-07T04:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.935508 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.935951 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.936020 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.936100 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.936187 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:12Z","lastTransitionTime":"2026-03-07T04:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:12 crc kubenswrapper[4689]: I0307 04:21:12.947315 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:12 crc kubenswrapper[4689]: E0307 04:21:12.947690 4689 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:12 crc kubenswrapper[4689]: E0307 04:21:12.947848 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs podName:16e0e2e8-673a-446e-b377-f30ffd8edd1f nodeName:}" failed. No retries permitted until 2026-03-07 04:21:20.947821642 +0000 UTC m=+125.994205131 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs") pod "network-metrics-daemon-95vzv" (UID: "16e0e2e8-673a-446e-b377-f30ffd8edd1f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.039918 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.039996 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.040025 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.040057 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.040076 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:13Z","lastTransitionTime":"2026-03-07T04:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.144128 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.144251 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.144272 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.144303 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.144324 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:13Z","lastTransitionTime":"2026-03-07T04:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.248389 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.248477 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.248508 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.248539 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.248563 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:13Z","lastTransitionTime":"2026-03-07T04:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.352238 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.352287 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.352307 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.352331 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.352350 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:13Z","lastTransitionTime":"2026-03-07T04:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.455371 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.455443 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.455464 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.455490 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.455509 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:13Z","lastTransitionTime":"2026-03-07T04:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.557939 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.558059 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.558079 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.558106 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.558141 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:13Z","lastTransitionTime":"2026-03-07T04:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.660679 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.660766 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.660785 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.660819 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.660840 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:13Z","lastTransitionTime":"2026-03-07T04:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.763848 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.763945 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.763970 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.764001 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.764020 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:13Z","lastTransitionTime":"2026-03-07T04:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.866983 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.867065 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.867085 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.867126 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.867144 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:13Z","lastTransitionTime":"2026-03-07T04:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.970631 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.970668 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.970676 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.970692 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:13 crc kubenswrapper[4689]: I0307 04:21:13.970704 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:13Z","lastTransitionTime":"2026-03-07T04:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.206097 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.206155 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.206189 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.206211 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.206223 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.209292 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.209347 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.209362 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.209382 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.209395 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.223421 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:14Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.227469 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.227521 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.227534 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.227554 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.227568 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.240620 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:14Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.245760 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.245824 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.245845 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.245881 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.245898 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.264397 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:14Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.269462 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.269497 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.269510 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.269529 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.269543 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.287231 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:14Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.295808 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.295862 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.295876 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.295895 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.295938 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.314443 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:14Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.314581 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.316333 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.316375 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.316388 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.316407 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.316422 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.419382 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.419433 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.419451 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.419478 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.419496 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.521979 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.522064 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.522090 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.522124 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.522149 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.625611 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.625674 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.625697 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.625726 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.625749 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.666476 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.666634 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 04:21:30.666604066 +0000 UTC m=+135.712987595 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.666700 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.666765 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.666817 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.666865 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.666973 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.667003 4689 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.667012 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.667048 4689 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.667112 4689 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.667138 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-07 04:21:30.66711514 +0000 UTC m=+135.713498659 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.667276 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:30.667243493 +0000 UTC m=+135.713626982 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.667295 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:30.667286264 +0000 UTC m=+135.713669753 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.667287 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.667352 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.667378 4689 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.667497 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 04:21:30.667462299 +0000 UTC m=+135.713845968 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.728345 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.728397 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.728415 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.728443 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.728459 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.825961 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.826045 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.825990 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.825990 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.826299 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.826658 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.826803 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:14 crc kubenswrapper[4689]: E0307 04:21:14.826547 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.831147 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.831227 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.831242 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.831260 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.831274 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.934493 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.934563 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.934584 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.934615 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:14 crc kubenswrapper[4689]: I0307 04:21:14.934638 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:14Z","lastTransitionTime":"2026-03-07T04:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.037901 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.037972 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.037991 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.038020 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.038038 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:15Z","lastTransitionTime":"2026-03-07T04:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.140819 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.140881 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.140902 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.140934 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.140958 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:15Z","lastTransitionTime":"2026-03-07T04:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.244142 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.244223 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.244245 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.244276 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.244297 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:15Z","lastTransitionTime":"2026-03-07T04:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.347054 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.347092 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.347104 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.347120 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.347132 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:15Z","lastTransitionTime":"2026-03-07T04:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.450189 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.450223 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.450235 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.450254 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.450266 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:15Z","lastTransitionTime":"2026-03-07T04:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.553881 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.553962 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.553981 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.554014 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.554033 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:15Z","lastTransitionTime":"2026-03-07T04:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.657650 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.657734 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.657756 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.657785 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.657806 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:15Z","lastTransitionTime":"2026-03-07T04:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:15 crc kubenswrapper[4689]: E0307 04:21:15.758691 4689 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.854033 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8bd2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:15Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 
04:21:15.880444 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e43
8502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:15Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.898685 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:15Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.923392 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:15Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:15 crc kubenswrapper[4689]: E0307 04:21:15.933603 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.941492 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:15Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.962034 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:15Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.978708 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:15Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:15 crc kubenswrapper[4689]: I0307 04:21:15.991950 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:15Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:16 crc 
kubenswrapper[4689]: I0307 04:21:16.009929 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:16Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:16 crc kubenswrapper[4689]: I0307 04:21:16.026118 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:16Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:16 crc kubenswrapper[4689]: I0307 04:21:16.040701 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:16Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:16 crc kubenswrapper[4689]: I0307 04:21:16.064569 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:10Z\\\",\\\"message\\\":\\\"0.534759 6705 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534567 6705 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534887 6705 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy 
(0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.535058 6705 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.534731 6705 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.535465 6705 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:10.536014 6705 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.536330 6705 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 04:21:10.536401 6705 factory.go:656] Stopping watch factory\\\\nI0307 04:21:10.536421 6705 ovnkube.go:599] Stopped ovnkube\\\\nI0307 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addd
a481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:16Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:16 crc kubenswrapper[4689]: I0307 04:21:16.086109 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:16Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:16 crc kubenswrapper[4689]: I0307 04:21:16.104929 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:16Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:16 crc kubenswrapper[4689]: I0307 04:21:16.128257 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:16Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:16 crc kubenswrapper[4689]: I0307 04:21:16.825007 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:16 crc kubenswrapper[4689]: I0307 04:21:16.825081 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:16 crc kubenswrapper[4689]: I0307 04:21:16.825155 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:16 crc kubenswrapper[4689]: E0307 04:21:16.825248 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:16 crc kubenswrapper[4689]: I0307 04:21:16.825111 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:16 crc kubenswrapper[4689]: E0307 04:21:16.825401 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:16 crc kubenswrapper[4689]: E0307 04:21:16.825492 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:16 crc kubenswrapper[4689]: E0307 04:21:16.825585 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:18 crc kubenswrapper[4689]: I0307 04:21:18.825698 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:18 crc kubenswrapper[4689]: E0307 04:21:18.825892 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:18 crc kubenswrapper[4689]: I0307 04:21:18.826006 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:18 crc kubenswrapper[4689]: I0307 04:21:18.826051 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:18 crc kubenswrapper[4689]: I0307 04:21:18.826211 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:18 crc kubenswrapper[4689]: E0307 04:21:18.826261 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:18 crc kubenswrapper[4689]: E0307 04:21:18.826397 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:18 crc kubenswrapper[4689]: E0307 04:21:18.826583 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:20 crc kubenswrapper[4689]: I0307 04:21:20.825854 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:20 crc kubenswrapper[4689]: I0307 04:21:20.825916 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:20 crc kubenswrapper[4689]: I0307 04:21:20.825952 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:20 crc kubenswrapper[4689]: E0307 04:21:20.826119 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:20 crc kubenswrapper[4689]: I0307 04:21:20.826159 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:20 crc kubenswrapper[4689]: E0307 04:21:20.826420 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:20 crc kubenswrapper[4689]: E0307 04:21:20.826633 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:20 crc kubenswrapper[4689]: E0307 04:21:20.826724 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:20 crc kubenswrapper[4689]: E0307 04:21:20.935611 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:21:20 crc kubenswrapper[4689]: I0307 04:21:20.950605 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:20 crc kubenswrapper[4689]: E0307 04:21:20.950855 4689 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:20 crc kubenswrapper[4689]: E0307 04:21:20.950979 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs podName:16e0e2e8-673a-446e-b377-f30ffd8edd1f nodeName:}" failed. No retries permitted until 2026-03-07 04:21:36.950953649 +0000 UTC m=+141.997337198 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs") pod "network-metrics-daemon-95vzv" (UID: "16e0e2e8-673a-446e-b377-f30ffd8edd1f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.407398 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.429997 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.452930 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.473095 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.486480 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.506012 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.527951 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.542485 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.563924 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.593127 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:10Z\\\",\\\"message\\\":\\\"0.534759 6705 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534567 6705 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534887 6705 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy 
(0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.535058 6705 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.534731 6705 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.535465 6705 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:10.536014 6705 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.536330 6705 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 04:21:10.536401 6705 factory.go:656] Stopping watch factory\\\\nI0307 04:21:10.536421 6705 ovnkube.go:599] Stopped ovnkube\\\\nI0307 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addd
a481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.613285 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.637016 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.656677 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.674840 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.699821 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:21 crc kubenswrapper[4689]: I0307 04:21:21.714978 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:21Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:22 crc kubenswrapper[4689]: I0307 04:21:22.824967 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:22 crc kubenswrapper[4689]: I0307 04:21:22.825037 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:22 crc kubenswrapper[4689]: I0307 04:21:22.824967 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:22 crc kubenswrapper[4689]: E0307 04:21:22.825153 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:22 crc kubenswrapper[4689]: I0307 04:21:22.824967 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:22 crc kubenswrapper[4689]: E0307 04:21:22.825340 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:22 crc kubenswrapper[4689]: E0307 04:21:22.825455 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:22 crc kubenswrapper[4689]: E0307 04:21:22.825709 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:23 crc kubenswrapper[4689]: I0307 04:21:23.827000 4689 scope.go:117] "RemoveContainer" containerID="2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd" Mar 07 04:21:23 crc kubenswrapper[4689]: I0307 04:21:23.841218 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.503068 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/1.log" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.507738 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d"} Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.530402 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.549045 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.565373 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.574833 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.574870 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.574882 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.574899 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.574911 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:24Z","lastTransitionTime":"2026-03-07T04:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.593432 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: E0307 04:21:24.598808 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.602074 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.602108 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.602118 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.602136 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.602150 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:24Z","lastTransitionTime":"2026-03-07T04:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.613506 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: E0307 04:21:24.617844 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.621276 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.621314 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.621327 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.621346 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.621356 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:24Z","lastTransitionTime":"2026-03-07T04:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.626365 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z 
is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: E0307 04:21:24.633478 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.636609 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.636650 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.636664 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.636683 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.636697 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:24Z","lastTransitionTime":"2026-03-07T04:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.644283 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:10Z\\\",\\\"message\\\":\\\"0.534759 6705 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534567 6705 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534887 6705 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy 
(0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.535058 6705 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.534731 6705 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.535465 6705 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:10.536014 6705 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.536330 6705 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 04:21:10.536401 6705 factory.go:656] Stopping watch factory\\\\nI0307 04:21:10.536421 6705 ovnkube.go:599] Stopped ovnkube\\\\nI0307 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: E0307 04:21:24.649051 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.651968 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.652026 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.652042 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.652060 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.652072 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:24Z","lastTransitionTime":"2026-03-07T04:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.657562 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: E0307 04:21:24.662390 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: E0307 04:21:24.662550 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.666219 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.676840 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop 
\\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.685609 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.695316 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.707564 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.715190 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.733164 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.750988 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:24Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.825423 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.825504 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.825423 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:24 crc kubenswrapper[4689]: E0307 04:21:24.825566 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:24 crc kubenswrapper[4689]: I0307 04:21:24.825645 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:24 crc kubenswrapper[4689]: E0307 04:21:24.825678 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:24 crc kubenswrapper[4689]: E0307 04:21:24.825755 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:24 crc kubenswrapper[4689]: E0307 04:21:24.825816 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.515294 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/2.log" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.516037 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/1.log" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.519311 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d" exitCode=1 Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.519376 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d"} Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.519490 4689 scope.go:117] "RemoveContainer" containerID="2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.520822 4689 scope.go:117] "RemoveContainer" containerID="2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d" Mar 07 04:21:25 crc 
kubenswrapper[4689]: E0307 04:21:25.521142 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.534275 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.550967 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.567828 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.587427 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.604555 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.621398 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc 
kubenswrapper[4689]: I0307 04:21:25.635884 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.655786 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.672895 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.689631 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.704558 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.723001 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.742815 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.765988 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:10Z\\\",\\\"message\\\":\\\"0.534759 6705 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534567 6705 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534887 6705 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy 
(0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.535058 6705 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.534731 6705 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.535465 6705 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:10.536014 6705 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.536330 6705 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 04:21:10.536401 6705 factory.go:656] Stopping watch factory\\\\nI0307 04:21:10.536421 6705 ovnkube.go:599] Stopped ovnkube\\\\nI0307 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"rom k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827373 6871 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:24.827561 6871 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827836 6871 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827920 6871 
reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827965 6871 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.828272 6871 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:24.828410 6871 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 04:21:24.828526 6871 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 04:21:24.828582 6871 factory.go:656] Stopping watch factory\\\\nI0307 04:21:24.828607 6871 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 04:21:24.828619 6871 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.784716 4689 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.800506 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.841404 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.856775 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.869546 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.890834 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a89f6605d2be1ae7af0e3b7b963a50d8c418229a7842d1e03d90b9b926869dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:10Z\\\",\\\"message\\\":\\\"0.534759 6705 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534567 6705 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.534887 6705 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy 
(0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.535058 6705 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:10.534731 6705 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.535465 6705 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:10.536014 6705 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:10.536330 6705 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 04:21:10.536401 6705 factory.go:656] Stopping watch factory\\\\nI0307 04:21:10.536421 6705 ovnkube.go:599] Stopped ovnkube\\\\nI0307 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"rom k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827373 6871 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:24.827561 6871 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827836 6871 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827920 6871 
reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827965 6871 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.828272 6871 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:24.828410 6871 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 04:21:24.828526 6871 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 04:21:24.828582 6871 factory.go:656] Stopping watch factory\\\\nI0307 04:21:24.828607 6871 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 04:21:24.828619 6871 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.911892 4689 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8bd2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.929682 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: E0307 04:21:25.936314 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.948147 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.962824 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.976535 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:25 crc kubenswrapper[4689]: I0307 04:21:25.987903 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:25Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:26 crc kubenswrapper[4689]: I0307 04:21:26.009724 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:26Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:26 crc kubenswrapper[4689]: I0307 04:21:26.073966 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:26Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:26 crc kubenswrapper[4689]: I0307 04:21:26.095672 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:26Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:26 crc kubenswrapper[4689]: I0307 04:21:26.109486 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:26Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:26 crc kubenswrapper[4689]: I0307 04:21:26.122560 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:26Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:26 crc kubenswrapper[4689]: I0307 04:21:26.135945 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:26Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:26 crc kubenswrapper[4689]: I0307 04:21:26.524332 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/2.log" Mar 07 04:21:26 crc kubenswrapper[4689]: I0307 04:21:26.825243 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:26 crc kubenswrapper[4689]: I0307 04:21:26.825321 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:26 crc kubenswrapper[4689]: I0307 04:21:26.825243 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:26 crc kubenswrapper[4689]: I0307 04:21:26.825387 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:26 crc kubenswrapper[4689]: E0307 04:21:26.825445 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:26 crc kubenswrapper[4689]: E0307 04:21:26.825589 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:26 crc kubenswrapper[4689]: E0307 04:21:26.825699 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:26 crc kubenswrapper[4689]: E0307 04:21:26.825835 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:28 crc kubenswrapper[4689]: I0307 04:21:28.825032 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:28 crc kubenswrapper[4689]: I0307 04:21:28.825032 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:28 crc kubenswrapper[4689]: I0307 04:21:28.825123 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:28 crc kubenswrapper[4689]: I0307 04:21:28.825093 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:28 crc kubenswrapper[4689]: E0307 04:21:28.825267 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:28 crc kubenswrapper[4689]: E0307 04:21:28.825439 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:28 crc kubenswrapper[4689]: E0307 04:21:28.825522 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:28 crc kubenswrapper[4689]: E0307 04:21:28.825602 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.285745 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.286887 4689 scope.go:117] "RemoveContainer" containerID="2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d" Mar 07 04:21:29 crc kubenswrapper[4689]: E0307 04:21:29.287235 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.307470 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.329554 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.347937 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.371771 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.390017 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.407319 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.427791 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.448591 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.462717 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.480369 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.496352 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.512504 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.529668 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.557511 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"rom k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827373 6871 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:24.827561 6871 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827836 6871 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827920 6871 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827965 6871 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.828272 6871 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:24.828410 6871 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 04:21:24.828526 6871 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 04:21:24.828582 6871 factory.go:656] Stopping watch factory\\\\nI0307 04:21:24.828607 6871 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 04:21:24.828619 6871 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addd
a481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.580061 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:29 crc kubenswrapper[4689]: I0307 04:21:29.596969 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:29Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:30 crc kubenswrapper[4689]: I0307 04:21:30.765464 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:21:30 crc kubenswrapper[4689]: I0307 
04:21:30.765624 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.765680 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:22:02.765644362 +0000 UTC m=+167.812027871 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:21:30 crc kubenswrapper[4689]: I0307 04:21:30.765754 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:30 crc kubenswrapper[4689]: I0307 04:21:30.765809 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.765826 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.765865 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:21:30 crc kubenswrapper[4689]: I0307 04:21:30.765874 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.765885 4689 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.765976 4689 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.766020 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-07 04:22:02.766008743 +0000 UTC m=+167.812392242 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.765933 4689 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.766044 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.766080 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:22:02.766049634 +0000 UTC m=+167.812433163 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.766088 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.766117 4689 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.766120 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:22:02.766100746 +0000 UTC m=+167.812484315 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.766233 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-07 04:22:02.766161038 +0000 UTC m=+167.812544567 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:21:30 crc kubenswrapper[4689]: I0307 04:21:30.825540 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:30 crc kubenswrapper[4689]: I0307 04:21:30.825621 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:30 crc kubenswrapper[4689]: I0307 04:21:30.825624 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:30 crc kubenswrapper[4689]: I0307 04:21:30.825624 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.825737 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.825866 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.826272 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.826337 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:30 crc kubenswrapper[4689]: I0307 04:21:30.844983 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 07 04:21:30 crc kubenswrapper[4689]: E0307 04:21:30.938415 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 07 04:21:32 crc kubenswrapper[4689]: I0307 04:21:32.825824 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:32 crc kubenswrapper[4689]: I0307 04:21:32.825888 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:32 crc kubenswrapper[4689]: E0307 04:21:32.826043 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:32 crc kubenswrapper[4689]: E0307 04:21:32.826235 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:32 crc kubenswrapper[4689]: I0307 04:21:32.826363 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:32 crc kubenswrapper[4689]: E0307 04:21:32.826489 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:32 crc kubenswrapper[4689]: I0307 04:21:32.826382 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:32 crc kubenswrapper[4689]: E0307 04:21:32.826607 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.824927 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.825004 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.825093 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:34 crc kubenswrapper[4689]: E0307 04:21:34.825118 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.825142 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:34 crc kubenswrapper[4689]: E0307 04:21:34.825431 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:34 crc kubenswrapper[4689]: E0307 04:21:34.825578 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:34 crc kubenswrapper[4689]: E0307 04:21:34.825703 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.964830 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.964904 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.964934 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.964969 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.964991 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:34Z","lastTransitionTime":"2026-03-07T04:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:34 crc kubenswrapper[4689]: E0307 04:21:34.987386 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:34Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.993608 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.993688 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.993707 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.993737 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:34 crc kubenswrapper[4689]: I0307 04:21:34.993751 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:34Z","lastTransitionTime":"2026-03-07T04:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:35 crc kubenswrapper[4689]: E0307 04:21:35.021472 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:35Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.027382 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.027505 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.027533 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.027561 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.027579 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:35Z","lastTransitionTime":"2026-03-07T04:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:35 crc kubenswrapper[4689]: E0307 04:21:35.050570 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:35Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.056890 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.056947 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.056967 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.056992 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.057013 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:35Z","lastTransitionTime":"2026-03-07T04:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:35 crc kubenswrapper[4689]: E0307 04:21:35.079545 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:35Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.089550 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.089625 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.089646 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.089675 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.089708 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:35Z","lastTransitionTime":"2026-03-07T04:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:35 crc kubenswrapper[4689]: E0307 04:21:35.114939 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:35Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:35 crc kubenswrapper[4689]: E0307 04:21:35.115237 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.851233 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb
9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\
":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:35Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.889948 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"rom k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827373 6871 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:24.827561 6871 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827836 6871 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827920 6871 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827965 6871 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.828272 6871 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:24.828410 6871 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 04:21:24.828526 6871 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 04:21:24.828582 6871 factory.go:656] Stopping watch factory\\\\nI0307 04:21:24.828607 6871 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 04:21:24.828619 6871 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addd
a481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:35Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.911734 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55122838-d44d-4686-a0ba-93e8f85122e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2c10b5babd421667e41329cf7d752810507842c003e9b0e24c07c59e3e866b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2145120e262bbc6fd4876167d2bc0bd4f23ca467a1ab81f57e8df919c721c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dcba7bfbb1a5097afa4c8643d0fdc845439b5107877e8689daed2072d34e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:35Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.937960 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:35Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:35 crc kubenswrapper[4689]: E0307 04:21:35.939276 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.961351 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:35Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:35 crc kubenswrapper[4689]: I0307 04:21:35.984068 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:35Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.004354 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:36Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.027025 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:36Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.052530 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:36Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.070047 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:36Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.093482 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:36Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.117104 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:36Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.135041 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:36Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.158152 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:36Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.173902 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:36Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.196375 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:36Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.216849 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:36Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.824923 4689 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.825012 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:36 crc kubenswrapper[4689]: E0307 04:21:36.825086 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.825149 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:36 crc kubenswrapper[4689]: I0307 04:21:36.825221 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:36 crc kubenswrapper[4689]: E0307 04:21:36.825334 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:36 crc kubenswrapper[4689]: E0307 04:21:36.825666 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:36 crc kubenswrapper[4689]: E0307 04:21:36.825778 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:37 crc kubenswrapper[4689]: I0307 04:21:37.038907 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:37 crc kubenswrapper[4689]: E0307 04:21:37.039281 4689 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:37 crc kubenswrapper[4689]: E0307 04:21:37.039399 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs podName:16e0e2e8-673a-446e-b377-f30ffd8edd1f nodeName:}" failed. No retries permitted until 2026-03-07 04:22:09.039367197 +0000 UTC m=+174.085750726 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs") pod "network-metrics-daemon-95vzv" (UID: "16e0e2e8-673a-446e-b377-f30ffd8edd1f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:21:38 crc kubenswrapper[4689]: I0307 04:21:38.825329 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:38 crc kubenswrapper[4689]: I0307 04:21:38.825406 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:38 crc kubenswrapper[4689]: I0307 04:21:38.825392 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:38 crc kubenswrapper[4689]: E0307 04:21:38.825539 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:38 crc kubenswrapper[4689]: I0307 04:21:38.825557 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:38 crc kubenswrapper[4689]: E0307 04:21:38.825663 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:38 crc kubenswrapper[4689]: E0307 04:21:38.825802 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:38 crc kubenswrapper[4689]: E0307 04:21:38.825943 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:40 crc kubenswrapper[4689]: I0307 04:21:40.825126 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:40 crc kubenswrapper[4689]: I0307 04:21:40.825233 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:40 crc kubenswrapper[4689]: I0307 04:21:40.825270 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:40 crc kubenswrapper[4689]: E0307 04:21:40.826447 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:40 crc kubenswrapper[4689]: E0307 04:21:40.826519 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:40 crc kubenswrapper[4689]: I0307 04:21:40.825389 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:40 crc kubenswrapper[4689]: E0307 04:21:40.826640 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:40 crc kubenswrapper[4689]: E0307 04:21:40.826411 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:40 crc kubenswrapper[4689]: E0307 04:21:40.940740 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:21:42 crc kubenswrapper[4689]: I0307 04:21:42.825230 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:42 crc kubenswrapper[4689]: I0307 04:21:42.825241 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:42 crc kubenswrapper[4689]: I0307 04:21:42.825323 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:42 crc kubenswrapper[4689]: I0307 04:21:42.825463 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:42 crc kubenswrapper[4689]: E0307 04:21:42.825685 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:42 crc kubenswrapper[4689]: E0307 04:21:42.825846 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:42 crc kubenswrapper[4689]: E0307 04:21:42.825925 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:42 crc kubenswrapper[4689]: E0307 04:21:42.826085 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:43 crc kubenswrapper[4689]: I0307 04:21:43.826718 4689 scope.go:117] "RemoveContainer" containerID="2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d" Mar 07 04:21:43 crc kubenswrapper[4689]: E0307 04:21:43.827905 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" Mar 07 04:21:43 crc kubenswrapper[4689]: I0307 04:21:43.841668 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 07 04:21:44 crc kubenswrapper[4689]: I0307 04:21:44.825899 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:44 crc kubenswrapper[4689]: I0307 04:21:44.826007 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:44 crc kubenswrapper[4689]: I0307 04:21:44.826049 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:44 crc kubenswrapper[4689]: I0307 04:21:44.825932 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:44 crc kubenswrapper[4689]: E0307 04:21:44.826077 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:44 crc kubenswrapper[4689]: E0307 04:21:44.826315 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:44 crc kubenswrapper[4689]: E0307 04:21:44.826469 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:44 crc kubenswrapper[4689]: E0307 04:21:44.826683 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.366814 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.366853 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.366863 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.366883 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.366895 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:45Z","lastTransitionTime":"2026-03-07T04:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:45 crc kubenswrapper[4689]: E0307 04:21:45.385596 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:45Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.390371 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.390448 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.390466 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.390489 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.390507 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:45Z","lastTransitionTime":"2026-03-07T04:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:45 crc kubenswrapper[4689]: E0307 04:21:45.408998 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:45Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.413904 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.413972 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.413991 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.414016 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.414033 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:45Z","lastTransitionTime":"2026-03-07T04:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:45 crc kubenswrapper[4689]: E0307 04:21:45.475486 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:45Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:45 crc kubenswrapper[4689]: E0307 04:21:45.475742 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.850529 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:45Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.868080 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:45Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.889736 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:45Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.924617 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"rom k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827373 6871 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:24.827561 6871 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827836 6871 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827920 6871 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827965 6871 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.828272 6871 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:24.828410 6871 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 04:21:24.828526 6871 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 04:21:24.828582 6871 factory.go:656] Stopping watch factory\\\\nI0307 04:21:24.828607 6871 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 04:21:24.828619 6871 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addd
a481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:45Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:45 crc kubenswrapper[4689]: E0307 04:21:45.941759 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.941996 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55122838-d44d-4686-a0ba-93e8f85122e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2c10b5babd421667e41329cf7d752810507842c003e9b0e24c07c59e3e866b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2145120e262bbc6fd4876167d2bc0bd4f23ca467a1ab81f57e8df919c721c18\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dcba7bfbb1a5097afa4c8643d0fdc845439b5107877e8689daed2072d34e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:45Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.968583 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:45Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:45 crc kubenswrapper[4689]: I0307 04:21:45.993035 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:45Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.006998 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9645983-f81a-4162-8ebc-49497cff7b37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742bbed66a04e7b4f7db672f2ed05a32c1592384661edebf72f01e9b1b7d0eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:46Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.022919 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:46Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.039401 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:46Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.058479 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:46Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.071873 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:46Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.089660 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:46Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.112301 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:46Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.129325 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:46Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.143529 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:46Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.160954 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:46Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.179561 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:46Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.825666 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:46 crc kubenswrapper[4689]: E0307 04:21:46.825883 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.826127 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:46 crc kubenswrapper[4689]: E0307 04:21:46.826426 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.826478 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:46 crc kubenswrapper[4689]: I0307 04:21:46.826561 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:46 crc kubenswrapper[4689]: E0307 04:21:46.826601 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:46 crc kubenswrapper[4689]: E0307 04:21:46.826741 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:48 crc kubenswrapper[4689]: I0307 04:21:48.826140 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:48 crc kubenswrapper[4689]: I0307 04:21:48.826227 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:48 crc kubenswrapper[4689]: I0307 04:21:48.826318 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:48 crc kubenswrapper[4689]: I0307 04:21:48.826342 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:48 crc kubenswrapper[4689]: E0307 04:21:48.826428 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:48 crc kubenswrapper[4689]: E0307 04:21:48.826545 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:48 crc kubenswrapper[4689]: E0307 04:21:48.826748 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:48 crc kubenswrapper[4689]: E0307 04:21:48.826914 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.628766 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wmhqx_5508b217-e634-41a8-813a-65ae39d7ea3d/kube-multus/0.log" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.628840 4689 generic.go:334] "Generic (PLEG): container finished" podID="5508b217-e634-41a8-813a-65ae39d7ea3d" containerID="733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd" exitCode=1 Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.628880 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wmhqx" event={"ID":"5508b217-e634-41a8-813a-65ae39d7ea3d","Type":"ContainerDied","Data":"733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd"} Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.629435 4689 scope.go:117] "RemoveContainer" containerID="733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.652333 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.672600 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.692065 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.714229 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.734782 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.755047 4689 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:49Z\\\",\\\"message\\\":\\\"2026-03-07T04:21:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95\\\\n2026-03-07T04:21:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95 to /host/opt/cni/bin/\\\\n2026-03-07T04:21:04Z [verbose] multus-daemon started\\\\n2026-03-07T04:21:04Z [verbose] Readiness Indicator file check\\\\n2026-03-07T04:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.786873 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"rom k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827373 6871 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:24.827561 6871 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827836 6871 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827920 6871 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827965 6871 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.828272 6871 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:24.828410 6871 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 04:21:24.828526 6871 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 04:21:24.828582 6871 factory.go:656] Stopping watch factory\\\\nI0307 04:21:24.828607 6871 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 04:21:24.828619 6871 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addd
a481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.809870 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55122838-d44d-4686-a0ba-93e8f85122e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2c10b5babd421667e41329cf7d752810507842c003e9b0e24c07c59e3e866b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2145120e262bbc6fd4876167d2bc0bd4f23ca467a1ab81f57e8df919c721c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dcba7bfbb1a5097afa4c8643d0fdc845439b5107877e8689daed2072d34e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.832687 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.850867 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.871246 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.886785 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.913044 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.937325 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.952662 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.973595 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:49 crc kubenswrapper[4689]: I0307 04:21:49.988409 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9645983-f81a-4162-8ebc-49497cff7b37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742bbed66a04e7b4f7db672f2ed05a32c1592384661edebf72f01e9b1b7d0eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:49Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.004355 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.636912 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wmhqx_5508b217-e634-41a8-813a-65ae39d7ea3d/kube-multus/0.log" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.637003 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wmhqx" 
event={"ID":"5508b217-e634-41a8-813a-65ae39d7ea3d","Type":"ContainerStarted","Data":"4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5"} Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.662484 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do 
sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.680575 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.703993 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.721992 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9645983-f81a-4162-8ebc-49497cff7b37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742bbed66a04e7b4f7db672f2ed05a32c1592384661edebf72f01e9b1b7d0eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.744727 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.761606 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.777552 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.792223 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.807699 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.820776 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.825144 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.825158 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.825158 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.825220 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:50 crc kubenswrapper[4689]: E0307 04:21:50.825418 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:50 crc kubenswrapper[4689]: E0307 04:21:50.825613 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:50 crc kubenswrapper[4689]: E0307 04:21:50.825666 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:50 crc kubenswrapper[4689]: E0307 04:21:50.825724 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.833268 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.846864 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.856744 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc 
kubenswrapper[4689]: I0307 04:21:50.866928 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55122838-d44d-4686-a0ba-93e8f85122e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2c10b5babd421667e41329cf7d752810507842c003e9b0e24c07c59e3e866b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2145120e262bbc6fd4876167d2bc0bd4f23ca467a1ab81f57e8df919c721c18\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dcba7bfbb1a5097afa4c8643d0fdc845439b5107877e8689daed2072d34e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.885365 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.900512 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.921961 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:49Z\\\",\\\"message\\\":\\\"2026-03-07T04:21:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95\\\\n2026-03-07T04:21:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95 to /host/opt/cni/bin/\\\\n2026-03-07T04:21:04Z [verbose] multus-daemon started\\\\n2026-03-07T04:21:04Z [verbose] 
Readiness Indicator file check\\\\n2026-03-07T04:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:50 crc kubenswrapper[4689]: E0307 04:21:50.943790 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 04:21:50 crc kubenswrapper[4689]: I0307 04:21:50.953909 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"rom k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827373 6871 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:24.827561 6871 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827836 6871 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827920 6871 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827965 6871 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.828272 6871 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:24.828410 6871 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 04:21:24.828526 6871 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 04:21:24.828582 6871 factory.go:656] Stopping watch factory\\\\nI0307 04:21:24.828607 6871 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 04:21:24.828619 6871 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addd
a481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:50Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:52 crc kubenswrapper[4689]: I0307 04:21:52.825985 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:52 crc kubenswrapper[4689]: I0307 04:21:52.826064 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:52 crc kubenswrapper[4689]: I0307 04:21:52.826064 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:52 crc kubenswrapper[4689]: I0307 04:21:52.826204 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:52 crc kubenswrapper[4689]: E0307 04:21:52.826350 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:52 crc kubenswrapper[4689]: E0307 04:21:52.826521 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:52 crc kubenswrapper[4689]: E0307 04:21:52.826622 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:52 crc kubenswrapper[4689]: E0307 04:21:52.826823 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:54 crc kubenswrapper[4689]: I0307 04:21:54.825019 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:54 crc kubenswrapper[4689]: E0307 04:21:54.825240 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:54 crc kubenswrapper[4689]: I0307 04:21:54.825504 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:54 crc kubenswrapper[4689]: E0307 04:21:54.825597 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:54 crc kubenswrapper[4689]: I0307 04:21:54.825774 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:54 crc kubenswrapper[4689]: E0307 04:21:54.825852 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:54 crc kubenswrapper[4689]: I0307 04:21:54.826033 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:54 crc kubenswrapper[4689]: E0307 04:21:54.826124 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.692036 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.692099 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.692120 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.692149 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.692215 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:55Z","lastTransitionTime":"2026-03-07T04:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:55 crc kubenswrapper[4689]: E0307 04:21:55.707993 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.712284 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.716618 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.716639 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.716672 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.716686 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:55Z","lastTransitionTime":"2026-03-07T04:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:55 crc kubenswrapper[4689]: E0307 04:21:55.734743 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.739429 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.739459 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.739469 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.739483 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.739493 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:55Z","lastTransitionTime":"2026-03-07T04:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:55 crc kubenswrapper[4689]: E0307 04:21:55.752646 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.756551 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.756607 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.756624 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.756655 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.756682 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:55Z","lastTransitionTime":"2026-03-07T04:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:55 crc kubenswrapper[4689]: E0307 04:21:55.769939 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.774677 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.774721 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.774734 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.774751 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.774764 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:21:55Z","lastTransitionTime":"2026-03-07T04:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:21:55 crc kubenswrapper[4689]: E0307 04:21:55.791646 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: E0307 04:21:55.792247 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.839088 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8bd2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.853367 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.868557 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9645983-f81a-4162-8ebc-49497cff7b37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742bbed66a04e7b4f7db672f2ed05a32c1592384661edebf72f01e9b1b7d0eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.886823 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.907285 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.929362 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: E0307 04:21:55.944703 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.945067 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.967820 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:55 crc kubenswrapper[4689]: I0307 04:21:55.983051 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:55Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.003482 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:56Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.022803 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:56Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.038685 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:56Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.053623 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:56Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.070827 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:56Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.088234 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:56Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.106278 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:49Z\\\",\\\"message\\\":\\\"2026-03-07T04:21:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95\\\\n2026-03-07T04:21:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95 to /host/opt/cni/bin/\\\\n2026-03-07T04:21:04Z [verbose] multus-daemon started\\\\n2026-03-07T04:21:04Z [verbose] 
Readiness Indicator file check\\\\n2026-03-07T04:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:56Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.137952 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"rom k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827373 6871 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:24.827561 6871 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827836 6871 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827920 6871 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827965 6871 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.828272 6871 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:24.828410 6871 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 04:21:24.828526 6871 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 04:21:24.828582 6871 factory.go:656] Stopping watch factory\\\\nI0307 04:21:24.828607 6871 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 04:21:24.828619 6871 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addd
a481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:56Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.150017 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55122838-d44d-4686-a0ba-93e8f85122e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2c10b5babd421667e41329cf7d752810507842c003e9b0e24c07c59e3e866b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2145120e262bbc6fd4876167d2bc0bd4f23ca467a1ab81f57e8df919c721c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dcba7bfbb1a5097afa4c8643d0fdc845439b5107877e8689daed2072d34e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:56Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.825590 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.825621 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.825590 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.826352 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:56 crc kubenswrapper[4689]: E0307 04:21:56.826827 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:56 crc kubenswrapper[4689]: E0307 04:21:56.827140 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:56 crc kubenswrapper[4689]: I0307 04:21:56.827250 4689 scope.go:117] "RemoveContainer" containerID="2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d" Mar 07 04:21:56 crc kubenswrapper[4689]: E0307 04:21:56.827343 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:56 crc kubenswrapper[4689]: E0307 04:21:56.827796 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.665405 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/2.log" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.668979 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a"} Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.669549 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.688848 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.704085 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.718500 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.733627 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.745970 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.761690 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55122838-d44d-4686-a0ba-93e8f85122e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2c10b5babd421667e41329cf7d752810507842c003e9b0e24c07c59e3e866b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2145120e262bbc6fd4876167d2bc0bd4f23ca467a1ab81f57e8df919c721c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dcba7bfbb1a5097afa4c8643d0fdc845439b5107877e8689daed2072d34e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b7973
67c417f173762c299b53c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.780782 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.794966 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.812980 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:49Z\\\",\\\"message\\\":\\\"2026-03-07T04:21:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95\\\\n2026-03-07T04:21:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95 to /host/opt/cni/bin/\\\\n2026-03-07T04:21:04Z [verbose] multus-daemon started\\\\n2026-03-07T04:21:04Z [verbose] 
Readiness Indicator file check\\\\n2026-03-07T04:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.840386 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"rom k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827373 6871 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:24.827561 6871 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827836 6871 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827920 6871 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827965 6871 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.828272 6871 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:24.828410 6871 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 04:21:24.828526 6871 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 04:21:24.828582 6871 factory.go:656] Stopping watch factory\\\\nI0307 04:21:24.828607 6871 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 04:21:24.828619 6871 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.859067 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.871934 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.893096 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.908655 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9645983-f81a-4162-8ebc-49497cff7b37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742bbed66a04e7b4f7db672f2ed05a32c1592384661edebf72f01e9b1b7d0eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.924423 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.941652 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.960364 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:57 crc kubenswrapper[4689]: I0307 04:21:57.973682 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.677557 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/3.log" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.678855 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/2.log" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.683656 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" exitCode=1 Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.683728 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a"} Mar 07 04:21:58 
crc kubenswrapper[4689]: I0307 04:21:58.683805 4689 scope.go:117] "RemoveContainer" containerID="2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.685229 4689 scope.go:117] "RemoveContainer" containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" Mar 07 04:21:58 crc kubenswrapper[4689]: E0307 04:21:58.685600 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.718991 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55122838-d44d-4686-a0ba-93e8f85122e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2c10b5babd421667e41329cf7d752810507842c003e9b0e24c07c59e3e866b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2145120e262bbc6fd4876167d2bc0bd4f23ca467a1ab81f57e8df919c721c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dcba7bfbb1a5097afa4c8643d0fdc845439b5107877e8689daed2072d34e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.741977 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.759898 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.786691 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:49Z\\\",\\\"message\\\":\\\"2026-03-07T04:21:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95\\\\n2026-03-07T04:21:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95 to /host/opt/cni/bin/\\\\n2026-03-07T04:21:04Z [verbose] multus-daemon started\\\\n2026-03-07T04:21:04Z [verbose] 
Readiness Indicator file check\\\\n2026-03-07T04:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.819588 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e5e2578bd5eb753d2d4c38da368cd0b98842f9e94401fa836a0fa3b486dbf4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:24Z\\\",\\\"message\\\":\\\"rom k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827373 6871 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 04:21:24.827561 6871 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827836 6871 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827920 6871 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.827965 6871 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 04:21:24.828272 6871 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 04:21:24.828410 6871 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 04:21:24.828526 6871 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 04:21:24.828582 6871 factory.go:656] Stopping watch factory\\\\nI0307 04:21:24.828607 6871 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 04:21:24.828619 6871 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:57Z\\\",\\\"message\\\":\\\"tart default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped 
already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z]\\\\nI0307 04:21:57.834479 7154 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"1dc899db-4498-4b7a-8437-861940b962e7\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Groups:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fd
a816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.824948 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.824984 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.825013 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:21:58 crc kubenswrapper[4689]: E0307 04:21:58.825652 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:21:58 crc kubenswrapper[4689]: E0307 04:21:58.825394 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:21:58 crc kubenswrapper[4689]: E0307 04:21:58.825823 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.825093 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:21:58 crc kubenswrapper[4689]: E0307 04:21:58.826012 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.845089 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop 
'(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.864711 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.892053 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.911086 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9645983-f81a-4162-8ebc-49497cff7b37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742bbed66a04e7b4f7db672f2ed05a32c1592384661edebf72f01e9b1b7d0eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.931163 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.953024 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:58 crc kubenswrapper[4689]: I0307 04:21:58.982356 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:58Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.005102 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.027681 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.052752 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.073395 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.091771 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.108284 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.690022 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/3.log" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.695066 4689 scope.go:117] "RemoveContainer" containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" Mar 07 04:21:59 crc kubenswrapper[4689]: E0307 04:21:59.695361 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.720644 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.740024 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9645983-f81a-4162-8ebc-49497cff7b37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742bbed66a04e7b4f7db672f2ed05a32c1592384661edebf72f01e9b1b7d0eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.756933 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.777409 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.802503 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.822426 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.841465 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.857664 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.875548 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.890021 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.906869 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.928209 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55122838-d44d-4686-a0ba-93e8f85122e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2c10b5babd421667e41329cf7d752810507842c003e9b0e24c07c59e3e866b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2145120e262bbc6fd4876167d2bc0bd4f23ca467a1ab81f57e8df919c721c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dcba7bfbb1a5097afa4c8643d0fdc845439b5107877e8689daed2072d34e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b7973
67c417f173762c299b53c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.949384 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.964900 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1
383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:21:59 crc kubenswrapper[4689]: I0307 04:21:59.989775 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:49Z\\\",\\\"message\\\":\\\"2026-03-07T04:21:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95\\\\n2026-03-07T04:21:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95 to /host/opt/cni/bin/\\\\n2026-03-07T04:21:04Z [verbose] multus-daemon started\\\\n2026-03-07T04:21:04Z [verbose] 
Readiness Indicator file check\\\\n2026-03-07T04:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:59Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:00 crc kubenswrapper[4689]: I0307 04:22:00.023720 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:57Z\\\",\\\"message\\\":\\\"tart default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z]\\\\nI0307 04:21:57.834479 7154 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"1dc899db-4498-4b7a-8437-861940b962e7\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addd
a481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:00Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:00 crc kubenswrapper[4689]: I0307 04:22:00.047876 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:00Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:00 crc kubenswrapper[4689]: I0307 04:22:00.068139 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8b
d2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:00Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:00 crc kubenswrapper[4689]: I0307 04:22:00.825782 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:00 crc kubenswrapper[4689]: I0307 04:22:00.825830 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:00 crc kubenswrapper[4689]: I0307 04:22:00.825782 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:00 crc kubenswrapper[4689]: E0307 04:22:00.825970 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:00 crc kubenswrapper[4689]: I0307 04:22:00.825926 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:00 crc kubenswrapper[4689]: E0307 04:22:00.826058 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:00 crc kubenswrapper[4689]: E0307 04:22:00.826268 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:00 crc kubenswrapper[4689]: E0307 04:22:00.826546 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:00 crc kubenswrapper[4689]: E0307 04:22:00.945942 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:22:02 crc kubenswrapper[4689]: I0307 04:22:02.785598 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.785809 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.78577001 +0000 UTC m=+231.832153569 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:22:02 crc kubenswrapper[4689]: I0307 04:22:02.785886 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:02 crc kubenswrapper[4689]: I0307 04:22:02.785944 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:02 crc kubenswrapper[4689]: I0307 04:22:02.785994 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:02 crc kubenswrapper[4689]: I0307 04:22:02.786046 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786113 4689 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786204 4689 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786265 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.786232853 +0000 UTC m=+231.832616382 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786226 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786335 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-07 04:23:06.786305785 +0000 UTC m=+231.832689284 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786336 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786359 4689 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786404 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.786391658 +0000 UTC m=+231.832775247 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786486 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786541 4689 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786563 4689 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.786671 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.786642195 +0000 UTC m=+231.833025884 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 04:22:02 crc kubenswrapper[4689]: I0307 04:22:02.825869 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:02 crc kubenswrapper[4689]: I0307 04:22:02.825943 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:02 crc kubenswrapper[4689]: I0307 04:22:02.826009 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:02 crc kubenswrapper[4689]: I0307 04:22:02.826076 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.826263 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.826424 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.826651 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:02 crc kubenswrapper[4689]: E0307 04:22:02.826746 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:04 crc kubenswrapper[4689]: I0307 04:22:04.825022 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:04 crc kubenswrapper[4689]: I0307 04:22:04.825022 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:04 crc kubenswrapper[4689]: E0307 04:22:04.825819 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:04 crc kubenswrapper[4689]: I0307 04:22:04.825236 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:04 crc kubenswrapper[4689]: I0307 04:22:04.825147 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:04 crc kubenswrapper[4689]: E0307 04:22:04.826322 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:04 crc kubenswrapper[4689]: E0307 04:22:04.826456 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:04 crc kubenswrapper[4689]: E0307 04:22:04.826594 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.843961 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8473eab1f07fdd80e8c16ed1cc479197e25e1b2285faf3ba576847d343132eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4441c2498ac1a7c93fc65c206f956f2cc2bc48bde8eb64d1a90cec230e8f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.868209 4689 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7032053e21da504eebb4b5a1763d403084cb7ceb9f9aa2a742656597b92c9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.896958 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9vncl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d529fad8-a51c-42d5-bdf1-3abb3ec3e85a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c371c1a7154bcee3c9c2470f892696c796cfc736f5db203ef6536d47edccce40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fh6hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9vncl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.911002 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95vzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e0e2e8-673a-446e-b377-f30ffd8edd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trdmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95vzv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.926188 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb68519a84996903109b384671584a1c18ba370d438f35009959ee3edc16bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.938960 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6e9469a-474b-45c6-b3bd-638cb7a2e226\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://586f532b9be5c9935ffc43ad187ceb5258ee6d5f31da56730a83810c778b95a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgf5b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dss5c\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:05 crc kubenswrapper[4689]: E0307 04:22:05.946684 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.954579 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wmhqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5508b217-e634-41a8-813a-65ae39d7ea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:49Z\\\",\\\"message\\\":\\\"2026-03-07T04:21:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95\\\\n2026-03-07T04:21:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9507d353-a443-4f9e-9eb3-a9211cabfb95 to /host/opt/cni/bin/\\\\n2026-03-07T04:21:04Z [verbose] multus-daemon started\\\\n2026-03-07T04:21:04Z [verbose] Readiness Indicator file check\\\\n2026-03-07T04:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6zh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wmhqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.983839 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee6653df-cf05-46a7-9187-97bfc3c5b849\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T04:21:57Z\\\",\\\"message\\\":\\\"tart default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start 
default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:21:57Z is after 2025-08-24T17:21:41Z]\\\\nI0307 04:21:57.834479 7154 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"1dc899db-4498-4b7a-8437-861940b962e7\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b7b4a6d118d5addd
a481fd70b2f7a49be3068b4a3078c63437b021c4fda816\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:20:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2hpp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j9bx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.997515 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.997580 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.997596 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.997629 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:22:05 crc kubenswrapper[4689]: I0307 04:22:05.997642 4689 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:22:05Z","lastTransitionTime":"2026-03-07T04:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.000046 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55122838-d44d-4686-a0ba-93e8f85122e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2c10b5babd421667e41329cf7d752810507842c003e9b0e24c07c59e3e866b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2145120e262bbc6fd4876167d2bc0bd4f23ca467a1ab81f57e8df919c721c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dcba7bfbb1a5097afa4c8643d0fdc845439b5107877e8689daed2072d34e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53d43cb6e683befdcf0c8a7e1d2793f9d1e4b797367c417f173762c299b53c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:05Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: E0307 04:22:06.013579 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.015370 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.018690 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.018762 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.018786 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.018819 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.018842 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:22:06Z","lastTransitionTime":"2026-03-07T04:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.031926 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"714c5fd0-3ab6-4d74-82ce-2e21630ace7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65168891a0da4615240dd538c725732ba4d33728e9c18de569672a5d8d5c1e82\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac3f372e81a5b2f04ec5e1264823079800d3d62289b6eaea2cfae2c426e72f24\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:19:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 04:19:18.536615 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0307 04:19:18.541033 1 observer_polling.go:159] Starting file observer\\\\nI0307 04:19:18.586819 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 04:19:18.595380 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0307 04:19:48.827881 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:19:47Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01d15790e4c62cf87439314dbaa94e9df0ac09f0badd6f7ca0a20a9d810b9e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9734ded9b1fa0835a428521b21a7dc5d2c607dbf55603d3840e16617f8afae1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: E0307 04:22:06.034043 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.037477 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.037521 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.037534 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.037553 4689 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.037568 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:22:06Z","lastTransitionTime":"2026-03-07T04:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.045709 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a8fe40-7781-4819-bb57-f52325e9fcc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58685a0d9f13c7dca982e327332467ecdce6a53eb3d513eafbe4efa5720124e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f1
2962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://882e803d8455cb1f09a5ddcf358ff808c5f8bd2ee61ecb6f9be6c0bf9233f8ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdpsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:21:04Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mxsgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: E0307 04:22:06.049671 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.053445 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.053488 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.053498 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.053515 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.053526 4689 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:22:06Z","lastTransitionTime":"2026-03-07T04:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.057143 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: E0307 04:22:06.065341 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.068910 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.069359 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.069435 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.069449 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.069471 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.069482 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:22:06Z","lastTransitionTime":"2026-03-07T04:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 04:22:06 crc kubenswrapper[4689]: E0307 04:22:06.087244 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T04:22:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f61933f-c340-4249-a24a-1d8f57f94460\\\",\\\"systemUUID\\\":\\\"6d63441d-81be-4ce6-837e-b2f91e86c31f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: E0307 04:22:06.087367 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.088398 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c70eda-8745-4c02-93db-062597d2dbc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecf0515f779db576e106dfe01d5363a7989a9751af6666afc855b67479f961dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ba91b8c5f93a10c964638306192a7452d89f0b0741edd04e1468eef1016a61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://389c37c7fe89ef4893aa976ac2bb0fc5fea9d3ff0ac58ed28ca45e438502437b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac879c35637c9e9f552a2d1b7460f3abd21b108211bf669fd5a41d4a76b60499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e23
57f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2357f79869d9ace69644dc10ccbd2e94f64a1721799ca78d9a08a339d212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0349a3a285c5ad557d12478c300a0b7739fed76381ede2a588b1aead8b2b454f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:06Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce0075d1e2cf32b3570b098dac6f531f0be8a5b9fabfe8350ec43f52bde21c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bzq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gjvmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.100113 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bxdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bffa53b-77e7-4859-bd19-cd5fae877d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4f5334c7244365b020b89225a84ac4f01840e8ee8a30ecf600c508c669ff040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T04:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h682w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:20:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bxdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.116046 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62d0d3-38fb-407a-89b0-9ba3a380c851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T04:20:21Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 04:20:21.311561 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 04:20:21.311820 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 04:20:21.312685 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2790203443/tls.crt::/tmp/serving-cert-2790203443/tls.key\\\\\\\"\\\\nI0307 04:20:21.660266 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 04:20:21.664601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 04:20:21.664634 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 04:20:21.664666 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 04:20:21.664673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 04:20:21.676690 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 04:20:21.676723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 04:20:21.676733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 04:20:21.676737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 04:20:21.676740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 04:20:21.676743 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 04:20:21.676959 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 04:20:21.679692 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T04:20:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07f4660a0bd2699390fb6a56c2b05aaf6
d30c292dc03ab38ceab531425cba8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.129342 4689 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9645983-f81a-4162-8ebc-49497cff7b37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T04:19:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742bbed66a04e7b4f7db672f2ed05a32c1592384661edebf72f01e9b1b7d0eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T04:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d90922b5155512ad8238a3b208c09ae7c8a2863b96a8b1350892d62b2622ab61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T04:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T04:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T04:19:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T04:22:06Z is after 2025-08-24T17:21:41Z" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.825816 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.825926 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.825844 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:06 crc kubenswrapper[4689]: E0307 04:22:06.826041 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:06 crc kubenswrapper[4689]: I0307 04:22:06.825804 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:06 crc kubenswrapper[4689]: E0307 04:22:06.826314 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:06 crc kubenswrapper[4689]: E0307 04:22:06.826461 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:06 crc kubenswrapper[4689]: E0307 04:22:06.826925 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:08 crc kubenswrapper[4689]: I0307 04:22:08.825416 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:08 crc kubenswrapper[4689]: I0307 04:22:08.825452 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:08 crc kubenswrapper[4689]: I0307 04:22:08.825506 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:08 crc kubenswrapper[4689]: I0307 04:22:08.825520 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:08 crc kubenswrapper[4689]: E0307 04:22:08.826467 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:08 crc kubenswrapper[4689]: E0307 04:22:08.826766 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:08 crc kubenswrapper[4689]: E0307 04:22:08.826837 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:08 crc kubenswrapper[4689]: E0307 04:22:08.826928 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:09 crc kubenswrapper[4689]: I0307 04:22:09.057229 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:09 crc kubenswrapper[4689]: E0307 04:22:09.057853 4689 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:22:09 crc kubenswrapper[4689]: E0307 04:22:09.058049 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs podName:16e0e2e8-673a-446e-b377-f30ffd8edd1f nodeName:}" failed. No retries permitted until 2026-03-07 04:23:13.058022588 +0000 UTC m=+238.104406077 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs") pod "network-metrics-daemon-95vzv" (UID: "16e0e2e8-673a-446e-b377-f30ffd8edd1f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 04:22:10 crc kubenswrapper[4689]: I0307 04:22:10.824838 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:10 crc kubenswrapper[4689]: E0307 04:22:10.825045 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:10 crc kubenswrapper[4689]: I0307 04:22:10.826068 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:10 crc kubenswrapper[4689]: E0307 04:22:10.826232 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:10 crc kubenswrapper[4689]: I0307 04:22:10.826300 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:10 crc kubenswrapper[4689]: I0307 04:22:10.826479 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:10 crc kubenswrapper[4689]: E0307 04:22:10.826473 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:10 crc kubenswrapper[4689]: E0307 04:22:10.826551 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:10 crc kubenswrapper[4689]: I0307 04:22:10.843819 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 07 04:22:10 crc kubenswrapper[4689]: E0307 04:22:10.948379 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:22:12 crc kubenswrapper[4689]: I0307 04:22:12.824936 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:12 crc kubenswrapper[4689]: I0307 04:22:12.825015 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:12 crc kubenswrapper[4689]: I0307 04:22:12.825029 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:12 crc kubenswrapper[4689]: I0307 04:22:12.825107 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:12 crc kubenswrapper[4689]: E0307 04:22:12.825643 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:12 crc kubenswrapper[4689]: E0307 04:22:12.825819 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:12 crc kubenswrapper[4689]: E0307 04:22:12.825934 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:12 crc kubenswrapper[4689]: E0307 04:22:12.826056 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:13 crc kubenswrapper[4689]: I0307 04:22:13.827410 4689 scope.go:117] "RemoveContainer" containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" Mar 07 04:22:13 crc kubenswrapper[4689]: E0307 04:22:13.827611 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" Mar 07 04:22:14 crc kubenswrapper[4689]: I0307 04:22:14.825055 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:14 crc kubenswrapper[4689]: I0307 04:22:14.825099 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:14 crc kubenswrapper[4689]: I0307 04:22:14.825073 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:14 crc kubenswrapper[4689]: I0307 04:22:14.825074 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:14 crc kubenswrapper[4689]: E0307 04:22:14.825302 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:14 crc kubenswrapper[4689]: E0307 04:22:14.825420 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:14 crc kubenswrapper[4689]: E0307 04:22:14.825565 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:14 crc kubenswrapper[4689]: E0307 04:22:14.825703 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:15 crc kubenswrapper[4689]: I0307 04:22:15.877125 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=71.877093659 podStartE2EDuration="1m11.877093659s" podCreationTimestamp="2026-03-07 04:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:15.877006607 +0000 UTC m=+180.923390156" watchObservedRunningTime="2026-03-07 04:22:15.877093659 +0000 UTC m=+180.923477178" Mar 07 04:22:15 crc kubenswrapper[4689]: I0307 04:22:15.916240 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=32.91621453 podStartE2EDuration="32.91621453s" podCreationTimestamp="2026-03-07 04:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:15.893813904 +0000 UTC m=+180.940197493" watchObservedRunningTime="2026-03-07 04:22:15.91621453 +0000 UTC m=+180.962598029" Mar 07 04:22:15 crc kubenswrapper[4689]: E0307 04:22:15.949307 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 04:22:15 crc kubenswrapper[4689]: I0307 04:22:15.974658 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gjvmk" podStartSLOduration=116.974629194 podStartE2EDuration="1m56.974629194s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:15.960863519 +0000 UTC m=+181.007247048" watchObservedRunningTime="2026-03-07 04:22:15.974629194 +0000 UTC m=+181.021012703" Mar 07 04:22:15 crc kubenswrapper[4689]: I0307 04:22:15.975135 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9bxdn" podStartSLOduration=117.97512876 podStartE2EDuration="1m57.97512876s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:15.974599163 +0000 UTC m=+181.020982652" watchObservedRunningTime="2026-03-07 04:22:15.97512876 +0000 UTC m=+181.021512259" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.001699 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=6.001677821 podStartE2EDuration="6.001677821s" podCreationTimestamp="2026-03-07 04:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:16.00064801 +0000 UTC m=+181.047031529" watchObservedRunningTime="2026-03-07 04:22:16.001677821 +0000 UTC m=+181.048061320" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.066757 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9vncl" podStartSLOduration=118.066734385 podStartE2EDuration="1m58.066734385s" podCreationTimestamp="2026-03-07 04:20:18 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:16.066606592 +0000 UTC m=+181.112990081" watchObservedRunningTime="2026-03-07 04:22:16.066734385 +0000 UTC m=+181.113117874" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.094111 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=46.094092982 podStartE2EDuration="46.094092982s" podCreationTimestamp="2026-03-07 04:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:16.093566105 +0000 UTC m=+181.139949594" watchObservedRunningTime="2026-03-07 04:22:16.094092982 +0000 UTC m=+181.140476461" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.139085 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podStartSLOduration=118.1390674 podStartE2EDuration="1m58.1390674s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:16.122339635 +0000 UTC m=+181.168723124" watchObservedRunningTime="2026-03-07 04:22:16.1390674 +0000 UTC m=+181.185450889" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.163945 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wmhqx" podStartSLOduration=117.163928441 podStartE2EDuration="1m57.163928441s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:16.139536094 +0000 UTC m=+181.185919583" watchObservedRunningTime="2026-03-07 04:22:16.163928441 +0000 UTC 
m=+181.210311950" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.196103 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=53.196081302 podStartE2EDuration="53.196081302s" podCreationTimestamp="2026-03-07 04:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:16.182559723 +0000 UTC m=+181.228943242" watchObservedRunningTime="2026-03-07 04:22:16.196081302 +0000 UTC m=+181.242464801" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.212520 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.212791 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.212891 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.212987 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.213072 4689 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T04:22:16Z","lastTransitionTime":"2026-03-07T04:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.286824 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mxsgf" podStartSLOduration=117.286801291 podStartE2EDuration="1m57.286801291s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:16.196640338 +0000 UTC m=+181.243023847" watchObservedRunningTime="2026-03-07 04:22:16.286801291 +0000 UTC m=+181.333184780" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.287970 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v"] Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.288439 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.291008 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.291006 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.291344 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.291825 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.349915 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.349979 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.350020 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.350036 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.350071 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.450782 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.450862 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.450962 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.451014 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.451094 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.451439 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.451469 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.452542 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.460437 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 
04:22:16.467664 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wx44v\" (UID: \"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.603044 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" Mar 07 04:22:16 crc kubenswrapper[4689]: W0307 04:22:16.626921 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdbe0b16_8dcc_42aa_b918_b9beb64cb7dd.slice/crio-9f603d7bca10b913b3f726e6fe938648c94fc373a69cc7db4e378d90c2d52035 WatchSource:0}: Error finding container 9f603d7bca10b913b3f726e6fe938648c94fc373a69cc7db4e378d90c2d52035: Status 404 returned error can't find the container with id 9f603d7bca10b913b3f726e6fe938648c94fc373a69cc7db4e378d90c2d52035 Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.763659 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" event={"ID":"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd","Type":"ContainerStarted","Data":"9f603d7bca10b913b3f726e6fe938648c94fc373a69cc7db4e378d90c2d52035"} Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.825552 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.825590 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.825679 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:16 crc kubenswrapper[4689]: E0307 04:22:16.825867 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:16 crc kubenswrapper[4689]: E0307 04:22:16.826004 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:16 crc kubenswrapper[4689]: E0307 04:22:16.826132 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.826568 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:16 crc kubenswrapper[4689]: E0307 04:22:16.826693 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.881300 4689 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 07 04:22:16 crc kubenswrapper[4689]: I0307 04:22:16.894764 4689 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 07 04:22:17 crc kubenswrapper[4689]: I0307 04:22:17.769781 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" event={"ID":"bdbe0b16-8dcc-42aa-b918-b9beb64cb7dd","Type":"ContainerStarted","Data":"7a5b8c55c6cfc90ac93f45def21a2a0ec06720d19c7c06dc33f108f09f6ea91a"} Mar 07 04:22:18 crc kubenswrapper[4689]: I0307 04:22:18.825771 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:18 crc kubenswrapper[4689]: I0307 04:22:18.825818 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:18 crc kubenswrapper[4689]: I0307 04:22:18.825881 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:18 crc kubenswrapper[4689]: E0307 04:22:18.825975 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:18 crc kubenswrapper[4689]: I0307 04:22:18.826045 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:18 crc kubenswrapper[4689]: E0307 04:22:18.826204 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:18 crc kubenswrapper[4689]: E0307 04:22:18.826366 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:18 crc kubenswrapper[4689]: E0307 04:22:18.826474 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:20 crc kubenswrapper[4689]: I0307 04:22:20.825411 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:20 crc kubenswrapper[4689]: I0307 04:22:20.825458 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:20 crc kubenswrapper[4689]: E0307 04:22:20.825991 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:20 crc kubenswrapper[4689]: I0307 04:22:20.825601 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:20 crc kubenswrapper[4689]: E0307 04:22:20.826157 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:20 crc kubenswrapper[4689]: I0307 04:22:20.825555 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:20 crc kubenswrapper[4689]: E0307 04:22:20.826342 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:20 crc kubenswrapper[4689]: E0307 04:22:20.826456 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:20 crc kubenswrapper[4689]: E0307 04:22:20.950781 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 07 04:22:22 crc kubenswrapper[4689]: I0307 04:22:22.825791 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:22 crc kubenswrapper[4689]: I0307 04:22:22.825846 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:22 crc kubenswrapper[4689]: I0307 04:22:22.826010 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:22 crc kubenswrapper[4689]: I0307 04:22:22.826207 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:22 crc kubenswrapper[4689]: E0307 04:22:22.826164 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:22 crc kubenswrapper[4689]: E0307 04:22:22.826464 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:22 crc kubenswrapper[4689]: E0307 04:22:22.826615 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:22 crc kubenswrapper[4689]: E0307 04:22:22.826803 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:24 crc kubenswrapper[4689]: I0307 04:22:24.825414 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:24 crc kubenswrapper[4689]: I0307 04:22:24.825475 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:24 crc kubenswrapper[4689]: I0307 04:22:24.825528 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:24 crc kubenswrapper[4689]: I0307 04:22:24.825443 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:24 crc kubenswrapper[4689]: E0307 04:22:24.825699 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:24 crc kubenswrapper[4689]: E0307 04:22:24.826205 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:24 crc kubenswrapper[4689]: E0307 04:22:24.826336 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:24 crc kubenswrapper[4689]: E0307 04:22:24.826470 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:24 crc kubenswrapper[4689]: I0307 04:22:24.826518 4689 scope.go:117] "RemoveContainer" containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" Mar 07 04:22:24 crc kubenswrapper[4689]: E0307 04:22:24.826690 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j9bx5_openshift-ovn-kubernetes(ee6653df-cf05-46a7-9187-97bfc3c5b849)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" Mar 07 04:22:25 crc kubenswrapper[4689]: E0307 04:22:25.951527 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:22:26 crc kubenswrapper[4689]: I0307 04:22:26.825570 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:26 crc kubenswrapper[4689]: I0307 04:22:26.825669 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:26 crc kubenswrapper[4689]: I0307 04:22:26.825662 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:26 crc kubenswrapper[4689]: I0307 04:22:26.825795 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:26 crc kubenswrapper[4689]: E0307 04:22:26.825992 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:26 crc kubenswrapper[4689]: E0307 04:22:26.826067 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:26 crc kubenswrapper[4689]: E0307 04:22:26.826315 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:26 crc kubenswrapper[4689]: E0307 04:22:26.826388 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:28 crc kubenswrapper[4689]: I0307 04:22:28.825376 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:28 crc kubenswrapper[4689]: I0307 04:22:28.825466 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:28 crc kubenswrapper[4689]: E0307 04:22:28.826777 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:28 crc kubenswrapper[4689]: I0307 04:22:28.825612 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:28 crc kubenswrapper[4689]: E0307 04:22:28.826875 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:28 crc kubenswrapper[4689]: I0307 04:22:28.825500 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:28 crc kubenswrapper[4689]: E0307 04:22:28.826971 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:28 crc kubenswrapper[4689]: E0307 04:22:28.827267 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:30 crc kubenswrapper[4689]: I0307 04:22:30.824763 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:30 crc kubenswrapper[4689]: I0307 04:22:30.824925 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:30 crc kubenswrapper[4689]: E0307 04:22:30.824974 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:30 crc kubenswrapper[4689]: E0307 04:22:30.825209 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:30 crc kubenswrapper[4689]: I0307 04:22:30.825302 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:30 crc kubenswrapper[4689]: E0307 04:22:30.825420 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:30 crc kubenswrapper[4689]: I0307 04:22:30.825479 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:30 crc kubenswrapper[4689]: E0307 04:22:30.825580 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:30 crc kubenswrapper[4689]: E0307 04:22:30.953229 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:22:32 crc kubenswrapper[4689]: I0307 04:22:32.825056 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:32 crc kubenswrapper[4689]: I0307 04:22:32.826283 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:32 crc kubenswrapper[4689]: E0307 04:22:32.826460 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:32 crc kubenswrapper[4689]: I0307 04:22:32.826549 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:32 crc kubenswrapper[4689]: E0307 04:22:32.826757 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:32 crc kubenswrapper[4689]: E0307 04:22:32.826891 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:32 crc kubenswrapper[4689]: I0307 04:22:32.827094 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:32 crc kubenswrapper[4689]: E0307 04:22:32.827477 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:34 crc kubenswrapper[4689]: I0307 04:22:34.842740 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:34 crc kubenswrapper[4689]: E0307 04:22:34.842967 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:34 crc kubenswrapper[4689]: I0307 04:22:34.843028 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:34 crc kubenswrapper[4689]: I0307 04:22:34.843122 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:34 crc kubenswrapper[4689]: I0307 04:22:34.843041 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:34 crc kubenswrapper[4689]: E0307 04:22:34.843332 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:34 crc kubenswrapper[4689]: E0307 04:22:34.843453 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:34 crc kubenswrapper[4689]: E0307 04:22:34.843618 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:35 crc kubenswrapper[4689]: I0307 04:22:35.852897 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wmhqx_5508b217-e634-41a8-813a-65ae39d7ea3d/kube-multus/1.log" Mar 07 04:22:35 crc kubenswrapper[4689]: I0307 04:22:35.854407 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wmhqx_5508b217-e634-41a8-813a-65ae39d7ea3d/kube-multus/0.log" Mar 07 04:22:35 crc kubenswrapper[4689]: I0307 04:22:35.854480 4689 generic.go:334] "Generic (PLEG): container finished" podID="5508b217-e634-41a8-813a-65ae39d7ea3d" containerID="4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5" exitCode=1 Mar 07 04:22:35 crc kubenswrapper[4689]: I0307 04:22:35.854520 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wmhqx" event={"ID":"5508b217-e634-41a8-813a-65ae39d7ea3d","Type":"ContainerDied","Data":"4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5"} Mar 07 04:22:35 crc kubenswrapper[4689]: I0307 04:22:35.854563 4689 scope.go:117] "RemoveContainer" containerID="733eeb45e2bbb699d306a2c580c0be277f134e6d97cec494762693b5f6d613dd" Mar 07 04:22:35 crc kubenswrapper[4689]: I0307 04:22:35.855287 4689 scope.go:117] "RemoveContainer" containerID="4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5" Mar 07 04:22:35 crc kubenswrapper[4689]: E0307 04:22:35.855645 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wmhqx_openshift-multus(5508b217-e634-41a8-813a-65ae39d7ea3d)\"" pod="openshift-multus/multus-wmhqx" podUID="5508b217-e634-41a8-813a-65ae39d7ea3d" Mar 07 04:22:35 crc kubenswrapper[4689]: I0307 04:22:35.881577 4689 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wx44v" podStartSLOduration=137.881544987 podStartE2EDuration="2m17.881544987s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:17.79449344 +0000 UTC m=+182.840876929" watchObservedRunningTime="2026-03-07 04:22:35.881544987 +0000 UTC m=+200.927928526" Mar 07 04:22:35 crc kubenswrapper[4689]: E0307 04:22:35.954264 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:22:36 crc kubenswrapper[4689]: I0307 04:22:36.825191 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:36 crc kubenswrapper[4689]: I0307 04:22:36.825238 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:36 crc kubenswrapper[4689]: I0307 04:22:36.825212 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:36 crc kubenswrapper[4689]: E0307 04:22:36.825370 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:36 crc kubenswrapper[4689]: I0307 04:22:36.825396 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:36 crc kubenswrapper[4689]: E0307 04:22:36.825511 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:36 crc kubenswrapper[4689]: E0307 04:22:36.825593 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:36 crc kubenswrapper[4689]: E0307 04:22:36.825819 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:36 crc kubenswrapper[4689]: I0307 04:22:36.860794 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wmhqx_5508b217-e634-41a8-813a-65ae39d7ea3d/kube-multus/1.log" Mar 07 04:22:38 crc kubenswrapper[4689]: I0307 04:22:38.825398 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:38 crc kubenswrapper[4689]: I0307 04:22:38.825531 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:38 crc kubenswrapper[4689]: I0307 04:22:38.825533 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:38 crc kubenswrapper[4689]: I0307 04:22:38.825398 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:38 crc kubenswrapper[4689]: E0307 04:22:38.825787 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:38 crc kubenswrapper[4689]: E0307 04:22:38.826025 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:38 crc kubenswrapper[4689]: E0307 04:22:38.826207 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:38 crc kubenswrapper[4689]: E0307 04:22:38.826241 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:39 crc kubenswrapper[4689]: I0307 04:22:39.826833 4689 scope.go:117] "RemoveContainer" containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" Mar 07 04:22:40 crc kubenswrapper[4689]: I0307 04:22:40.825379 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:40 crc kubenswrapper[4689]: I0307 04:22:40.825408 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:40 crc kubenswrapper[4689]: E0307 04:22:40.826064 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:40 crc kubenswrapper[4689]: I0307 04:22:40.825485 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:40 crc kubenswrapper[4689]: I0307 04:22:40.825427 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:40 crc kubenswrapper[4689]: E0307 04:22:40.826323 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:40 crc kubenswrapper[4689]: E0307 04:22:40.826541 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:40 crc kubenswrapper[4689]: E0307 04:22:40.826703 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:40 crc kubenswrapper[4689]: I0307 04:22:40.879931 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/3.log" Mar 07 04:22:40 crc kubenswrapper[4689]: I0307 04:22:40.884349 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerStarted","Data":"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196"} Mar 07 04:22:40 crc kubenswrapper[4689]: I0307 04:22:40.885220 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:22:40 crc kubenswrapper[4689]: I0307 04:22:40.936306 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-95vzv"] Mar 07 04:22:40 crc kubenswrapper[4689]: I0307 04:22:40.936423 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:40 crc kubenswrapper[4689]: E0307 04:22:40.936509 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:40 crc kubenswrapper[4689]: I0307 04:22:40.939827 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podStartSLOduration=141.939808578 podStartE2EDuration="2m21.939808578s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:22:40.939678345 +0000 UTC m=+205.986061854" watchObservedRunningTime="2026-03-07 04:22:40.939808578 +0000 UTC m=+205.986192087" Mar 07 04:22:40 crc kubenswrapper[4689]: E0307 04:22:40.955431 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:22:42 crc kubenswrapper[4689]: I0307 04:22:42.825606 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:42 crc kubenswrapper[4689]: I0307 04:22:42.825736 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:42 crc kubenswrapper[4689]: I0307 04:22:42.825841 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:42 crc kubenswrapper[4689]: E0307 04:22:42.826031 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:42 crc kubenswrapper[4689]: I0307 04:22:42.826063 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:42 crc kubenswrapper[4689]: E0307 04:22:42.826261 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:42 crc kubenswrapper[4689]: E0307 04:22:42.826383 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:42 crc kubenswrapper[4689]: E0307 04:22:42.826547 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:44 crc kubenswrapper[4689]: I0307 04:22:44.825694 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:44 crc kubenswrapper[4689]: I0307 04:22:44.825713 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:44 crc kubenswrapper[4689]: E0307 04:22:44.826463 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:44 crc kubenswrapper[4689]: I0307 04:22:44.825762 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:44 crc kubenswrapper[4689]: I0307 04:22:44.825735 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:44 crc kubenswrapper[4689]: E0307 04:22:44.826591 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:44 crc kubenswrapper[4689]: E0307 04:22:44.826699 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:44 crc kubenswrapper[4689]: E0307 04:22:44.826875 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:45 crc kubenswrapper[4689]: E0307 04:22:45.956300 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:22:46 crc kubenswrapper[4689]: I0307 04:22:46.825237 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:46 crc kubenswrapper[4689]: I0307 04:22:46.825298 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:46 crc kubenswrapper[4689]: I0307 04:22:46.825154 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:46 crc kubenswrapper[4689]: I0307 04:22:46.825255 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:46 crc kubenswrapper[4689]: E0307 04:22:46.825517 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:46 crc kubenswrapper[4689]: E0307 04:22:46.825637 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:46 crc kubenswrapper[4689]: E0307 04:22:46.825782 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:46 crc kubenswrapper[4689]: E0307 04:22:46.825984 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:48 crc kubenswrapper[4689]: I0307 04:22:48.825468 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:48 crc kubenswrapper[4689]: I0307 04:22:48.825559 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:48 crc kubenswrapper[4689]: I0307 04:22:48.825775 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:48 crc kubenswrapper[4689]: I0307 04:22:48.825820 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:48 crc kubenswrapper[4689]: E0307 04:22:48.825989 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:48 crc kubenswrapper[4689]: E0307 04:22:48.826132 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:48 crc kubenswrapper[4689]: E0307 04:22:48.826280 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:48 crc kubenswrapper[4689]: E0307 04:22:48.826546 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:50 crc kubenswrapper[4689]: I0307 04:22:50.825767 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:50 crc kubenswrapper[4689]: I0307 04:22:50.825896 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:50 crc kubenswrapper[4689]: E0307 04:22:50.826052 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:50 crc kubenswrapper[4689]: I0307 04:22:50.826090 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:50 crc kubenswrapper[4689]: I0307 04:22:50.826217 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:50 crc kubenswrapper[4689]: E0307 04:22:50.826291 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:50 crc kubenswrapper[4689]: E0307 04:22:50.826419 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:50 crc kubenswrapper[4689]: E0307 04:22:50.826669 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:50 crc kubenswrapper[4689]: I0307 04:22:50.827325 4689 scope.go:117] "RemoveContainer" containerID="4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5" Mar 07 04:22:50 crc kubenswrapper[4689]: E0307 04:22:50.957852 4689 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 04:22:51 crc kubenswrapper[4689]: I0307 04:22:51.938335 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wmhqx_5508b217-e634-41a8-813a-65ae39d7ea3d/kube-multus/1.log" Mar 07 04:22:51 crc kubenswrapper[4689]: I0307 04:22:51.938428 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wmhqx" event={"ID":"5508b217-e634-41a8-813a-65ae39d7ea3d","Type":"ContainerStarted","Data":"893297981dd6ce3d3fbe960d1e5b7c6adc5bb2f18dcfd916b37cf25761cff3d9"} Mar 07 04:22:52 crc kubenswrapper[4689]: I0307 04:22:52.825567 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:52 crc kubenswrapper[4689]: I0307 04:22:52.825600 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:52 crc kubenswrapper[4689]: I0307 04:22:52.825662 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:52 crc kubenswrapper[4689]: I0307 04:22:52.825841 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:52 crc kubenswrapper[4689]: E0307 04:22:52.826600 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:52 crc kubenswrapper[4689]: E0307 04:22:52.826802 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:52 crc kubenswrapper[4689]: E0307 04:22:52.826925 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:52 crc kubenswrapper[4689]: E0307 04:22:52.826467 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:54 crc kubenswrapper[4689]: I0307 04:22:54.825697 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:54 crc kubenswrapper[4689]: I0307 04:22:54.825749 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:54 crc kubenswrapper[4689]: I0307 04:22:54.825756 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:54 crc kubenswrapper[4689]: I0307 04:22:54.825693 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:54 crc kubenswrapper[4689]: E0307 04:22:54.825913 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 04:22:54 crc kubenswrapper[4689]: E0307 04:22:54.826098 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95vzv" podUID="16e0e2e8-673a-446e-b377-f30ffd8edd1f" Mar 07 04:22:54 crc kubenswrapper[4689]: E0307 04:22:54.826245 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 04:22:54 crc kubenswrapper[4689]: E0307 04:22:54.826386 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 04:22:56 crc kubenswrapper[4689]: I0307 04:22:56.825140 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:22:56 crc kubenswrapper[4689]: I0307 04:22:56.825250 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:22:56 crc kubenswrapper[4689]: I0307 04:22:56.825313 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:22:56 crc kubenswrapper[4689]: I0307 04:22:56.825367 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:22:56 crc kubenswrapper[4689]: I0307 04:22:56.830320 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 07 04:22:56 crc kubenswrapper[4689]: I0307 04:22:56.830412 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 07 04:22:56 crc kubenswrapper[4689]: I0307 04:22:56.831190 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 07 04:22:56 crc kubenswrapper[4689]: I0307 04:22:56.831389 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 07 04:22:56 crc kubenswrapper[4689]: I0307 04:22:56.831821 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 07 04:22:56 crc kubenswrapper[4689]: I0307 04:22:56.832599 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 07 04:22:56 crc kubenswrapper[4689]: I0307 04:22:56.946634 4689 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:56.997305 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:56.997769 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:56.999663 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:56.999871 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.002638 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.003015 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.005377 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9x8l6"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.005595 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fcn6x"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.005782 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h6hq2"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.006271 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.006966 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.007229 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.007389 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.007710 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.008247 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.011294 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.011757 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.012009 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.013349 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.013559 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.018651 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.022912 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8ggcp"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.023236 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.023340 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zw6mx"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.023652 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.024054 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.024328 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.024466 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.024542 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.025218 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.025879 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-prpp8"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.026517 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.026659 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-nnnmk"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.026983 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.027389 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nnnmk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.028588 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.029285 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.045774 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.053338 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.053581 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.053754 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.054018 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.054385 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.054889 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.054979 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.055148 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 
04:22:57.056083 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.056705 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.057378 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.063049 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085337 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085586 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085639 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085678 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5t86\" (UniqueName: \"kubernetes.io/projected/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-kube-api-access-w5t86\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085703 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085745 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d79bc2b-a849-4d82-bc59-197431e014db-serving-cert\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085768 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-dir\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085785 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085807 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085834 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085881 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72404506-4e6f-4494-a61a-2ac56bd6b123-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hzrqj\" (UID: \"72404506-4e6f-4494-a61a-2ac56bd6b123\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085899 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085915 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-config\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: 
\"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085929 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h52md\" (UniqueName: \"kubernetes.io/projected/afea7082-9f6d-4c1f-a9be-ad1444e1459e-kube-api-access-h52md\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085963 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085981 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qq6k\" (UniqueName: \"kubernetes.io/projected/7d79bc2b-a849-4d82-bc59-197431e014db-kube-api-access-4qq6k\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086005 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72404506-4e6f-4494-a61a-2ac56bd6b123-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hzrqj\" (UID: \"72404506-4e6f-4494-a61a-2ac56bd6b123\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" Mar 07 04:22:57 crc 
kubenswrapper[4689]: I0307 04:22:57.086020 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086042 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-policies\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086061 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-client-ca\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086092 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5ccn\" (UniqueName: \"kubernetes.io/projected/72404506-4e6f-4494-a61a-2ac56bd6b123-kube-api-access-p5ccn\") pod \"openshift-apiserver-operator-796bbdcf4f-hzrqj\" (UID: \"72404506-4e6f-4494-a61a-2ac56bd6b123\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086110 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086136 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-config\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086151 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-client-ca\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086191 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086214 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086232 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086251 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afea7082-9f6d-4c1f-a9be-ad1444e1459e-serving-cert\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.085810 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.086515 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.087937 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.088089 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6xlwc"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.089014 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.090489 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.089702 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-j4z8p"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.091161 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.091217 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.089916 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.092018 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.092497 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.092779 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.093020 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.091426 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.089405 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.089816 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.089890 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.089958 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.089300 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.094922 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.095021 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.095098 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.095200 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.096630 4689 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.096812 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.096929 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.097090 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.097260 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.097339 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.097413 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.097467 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.097610 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.097693 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.097760 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 
04:22:57.097779 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.097909 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.098077 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.098147 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.098159 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.098283 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.098294 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.098440 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.099293 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.100388 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.101324 4689 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.102338 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.102787 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.102956 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.105325 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w89ns"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.105575 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.105770 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.106040 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.106053 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.106162 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.107725 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.116211 4689 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4cbc9"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.116870 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.117645 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.118073 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.118356 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.118509 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.118535 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.119359 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.119644 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.120069 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.120808 4689 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.120922 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.121415 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.121702 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.122215 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.125930 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.126213 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.126229 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.122505 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.126711 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.126014 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.143442 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.144083 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.148750 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.126339 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.123102 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.124748 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.124786 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.124823 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.124855 4689 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.122747 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.157443 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.166838 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.168374 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.169088 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.170233 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.171056 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.172420 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.172698 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-7dvxk"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.173044 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.173773 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.174824 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.175088 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.175422 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.176278 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.176886 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.177615 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.177904 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.178347 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.178662 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tbpn9"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.179306 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.179468 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.180005 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.181067 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.181521 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m4p5r"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.181834 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.182060 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.182946 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.183452 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.185478 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.186088 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.186341 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.187042 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.189092 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5d0f9cf7-c781-4964-a714-bcd780e88285-console-oauth-config\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.191349 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5ccn\" (UniqueName: \"kubernetes.io/projected/72404506-4e6f-4494-a61a-2ac56bd6b123-kube-api-access-p5ccn\") pod \"openshift-apiserver-operator-796bbdcf4f-hzrqj\" (UID: \"72404506-4e6f-4494-a61a-2ac56bd6b123\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.191474 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.191575 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d0f9cf7-c781-4964-a714-bcd780e88285-console-serving-cert\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.191672 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/479d9009-47b4-4a26-990e-30e757c7aa17-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.191771 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-audit-dir\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.191868 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-config\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.191976 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-client-ca\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.192087 4689 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnkds\" (UniqueName: \"kubernetes.io/projected/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-kube-api-access-gnkds\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.193366 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.193472 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-console-config\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.193529 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-client-ca\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.193612 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-image-import-ca\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " 
pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.193683 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.193765 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.193919 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afea7082-9f6d-4c1f-a9be-ad1444e1459e-serving-cert\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.193993 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-service-ca\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.194068 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-etcd-client\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.194156 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/479d9009-47b4-4a26-990e-30e757c7aa17-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.194279 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkrc5\" (UniqueName: \"kubernetes.io/projected/5d0f9cf7-c781-4964-a714-bcd780e88285-kube-api-access-nkrc5\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.194369 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.194441 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5t86\" (UniqueName: \"kubernetes.io/projected/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-kube-api-access-w5t86\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc 
kubenswrapper[4689]: I0307 04:22:57.194513 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-node-pullsecrets\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.194620 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.194698 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4a1ec5-fba3-4058-930e-96b000e4b052-config\") pod \"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.194786 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d79bc2b-a849-4d82-bc59-197431e014db-serving-cert\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.194854 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e244dd83-cd20-40f8-a639-3164577c7316-serving-cert\") pod \"etcd-operator-b45778765-w89ns\" (UID: 
\"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.194932 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e244dd83-cd20-40f8-a639-3164577c7316-etcd-client\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.195034 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e244dd83-cd20-40f8-a639-3164577c7316-etcd-ca\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.195125 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b4a1ec5-fba3-4058-930e-96b000e4b052-trusted-ca\") pod \"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.195223 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-dir\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.195309 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.195386 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.196335 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-serving-cert\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.196648 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-oauth-serving-cert\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.196743 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-encryption-config\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.196853 4689 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htlm7\" (UniqueName: \"kubernetes.io/projected/6b4a1ec5-fba3-4058-930e-96b000e4b052-kube-api-access-htlm7\") pod \"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.196970 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74sft\" (UniqueName: \"kubernetes.io/projected/774b5998-29de-4546-937e-b5d2ee0b27d4-kube-api-access-74sft\") pod \"openshift-config-operator-7777fb866f-xfcf7\" (UID: \"774b5998-29de-4546-937e-b5d2ee0b27d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.197079 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.197211 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e244dd83-cd20-40f8-a639-3164577c7316-config\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.197332 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-config\") pod \"apiserver-76f77b778f-h6hq2\" (UID: 
\"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.197427 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72404506-4e6f-4494-a61a-2ac56bd6b123-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hzrqj\" (UID: \"72404506-4e6f-4494-a61a-2ac56bd6b123\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.197501 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.197569 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-config\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.197638 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h52md\" (UniqueName: \"kubernetes.io/projected/afea7082-9f6d-4c1f-a9be-ad1444e1459e-kube-api-access-h52md\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.197718 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.197810 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/774b5998-29de-4546-937e-b5d2ee0b27d4-serving-cert\") pod \"openshift-config-operator-7777fb866f-xfcf7\" (UID: \"774b5998-29de-4546-937e-b5d2ee0b27d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.197886 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.197970 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qq6k\" (UniqueName: \"kubernetes.io/projected/7d79bc2b-a849-4d82-bc59-197431e014db-kube-api-access-4qq6k\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.198039 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-etcd-serving-ca\") pod \"apiserver-76f77b778f-h6hq2\" (UID: 
\"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.198110 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8shw9\" (UniqueName: \"kubernetes.io/projected/479d9009-47b4-4a26-990e-30e757c7aa17-kube-api-access-8shw9\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.199053 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72404506-4e6f-4494-a61a-2ac56bd6b123-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hzrqj\" (UID: \"72404506-4e6f-4494-a61a-2ac56bd6b123\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.199149 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72404506-4e6f-4494-a61a-2ac56bd6b123-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hzrqj\" (UID: \"72404506-4e6f-4494-a61a-2ac56bd6b123\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.199283 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b4a1ec5-fba3-4058-930e-96b000e4b052-serving-cert\") pod \"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.199402 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.199509 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj55s\" (UniqueName: \"kubernetes.io/projected/e244dd83-cd20-40f8-a639-3164577c7316-kube-api-access-sj55s\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.199610 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/774b5998-29de-4546-937e-b5d2ee0b27d4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xfcf7\" (UID: \"774b5998-29de-4546-937e-b5d2ee0b27d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.199708 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/479d9009-47b4-4a26-990e-30e757c7aa17-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.190947 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9x8l6"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.199904 4689 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.200107 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.199306 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.200278 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-policies\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.198716 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.198621 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.195268 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-config\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.199345 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-dir\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.201280 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.199820 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-policies\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.201595 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e244dd83-cd20-40f8-a639-3164577c7316-etcd-service-ca\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.201639 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-audit\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.201670 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-client-ca\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.201689 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-trusted-ca-bundle\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.202372 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-config\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 
04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.202947 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.203295 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afea7082-9f6d-4c1f-a9be-ad1444e1459e-serving-cert\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.203899 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.204243 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.204592 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.204735 4689 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.205213 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72404506-4e6f-4494-a61a-2ac56bd6b123-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hzrqj\" (UID: \"72404506-4e6f-4494-a61a-2ac56bd6b123\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.205300 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-client-ca\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.205614 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.205775 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.206102 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.207493 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d79bc2b-a849-4d82-bc59-197431e014db-serving-cert\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.207901 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.208431 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.208580 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zdmpn"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.222977 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.223541 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.223125 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.224539 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.241462 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.242058 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.242459 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.244106 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547622-4796h"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.244549 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.245374 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-djkqv"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.245920 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.245944 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h6hq2"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.245956 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fcn6x"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.245971 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xwr6f"] Mar 07 04:22:57 crc 
kubenswrapper[4689]: I0307 04:22:57.246330 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547622-4796h" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.246569 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.247498 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xwr6f" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.246693 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8ggcp"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.247696 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.247720 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zw6mx"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.247735 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-prpp8"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.247752 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w89ns"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.247782 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6xlwc"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.247795 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.247810 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.247822 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.247834 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-j4z8p"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.250909 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.253585 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.255206 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.256706 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.257884 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tbpn9"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.259288 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.259348 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.262320 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.263995 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8ls5c"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.266050 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-q6whv"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.266217 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8ls5c" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.266962 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-q6whv" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.267127 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nnnmk"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.268533 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.269615 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.270959 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.272518 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.274223 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm"] Mar 07 04:22:57 
crc kubenswrapper[4689]: I0307 04:22:57.275059 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m4p5r"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.276702 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xwr6f"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.278369 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.279734 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4cbc9"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.280649 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.280912 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.282409 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.283813 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8ls5c"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.285119 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547622-4796h"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.286464 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zdmpn"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.288016 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.289310 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-djkqv"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.290592 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c6r5s"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.292351 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c6r5s"] Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.292475 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.300041 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.302853 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b4a1ec5-fba3-4058-930e-96b000e4b052-trusted-ca\") pod \"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.302893 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-serving-cert\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.302920 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-oauth-serving-cert\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.302942 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-encryption-config\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.302967 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htlm7\" (UniqueName: \"kubernetes.io/projected/6b4a1ec5-fba3-4058-930e-96b000e4b052-kube-api-access-htlm7\") pod \"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.302991 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74sft\" (UniqueName: \"kubernetes.io/projected/774b5998-29de-4546-937e-b5d2ee0b27d4-kube-api-access-74sft\") pod \"openshift-config-operator-7777fb866f-xfcf7\" (UID: \"774b5998-29de-4546-937e-b5d2ee0b27d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303035 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e244dd83-cd20-40f8-a639-3164577c7316-config\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303060 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-config\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303132 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303159 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/774b5998-29de-4546-937e-b5d2ee0b27d4-serving-cert\") pod \"openshift-config-operator-7777fb866f-xfcf7\" (UID: \"774b5998-29de-4546-937e-b5d2ee0b27d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303212 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-etcd-serving-ca\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303256 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8shw9\" (UniqueName: \"kubernetes.io/projected/479d9009-47b4-4a26-990e-30e757c7aa17-kube-api-access-8shw9\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:57 crc 
kubenswrapper[4689]: I0307 04:22:57.303313 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b4a1ec5-fba3-4058-930e-96b000e4b052-serving-cert\") pod \"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303344 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj55s\" (UniqueName: \"kubernetes.io/projected/e244dd83-cd20-40f8-a639-3164577c7316-kube-api-access-sj55s\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303402 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/774b5998-29de-4546-937e-b5d2ee0b27d4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xfcf7\" (UID: \"774b5998-29de-4546-937e-b5d2ee0b27d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303444 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/479d9009-47b4-4a26-990e-30e757c7aa17-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303479 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e244dd83-cd20-40f8-a639-3164577c7316-etcd-service-ca\") pod 
\"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303502 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-audit\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303528 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-trusted-ca-bundle\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303557 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5d0f9cf7-c781-4964-a714-bcd780e88285-console-oauth-config\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303603 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d0f9cf7-c781-4964-a714-bcd780e88285-console-serving-cert\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303642 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-audit-dir\") pod \"apiserver-76f77b778f-h6hq2\" (UID: 
\"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303671 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/479d9009-47b4-4a26-990e-30e757c7aa17-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303721 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnkds\" (UniqueName: \"kubernetes.io/projected/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-kube-api-access-gnkds\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303748 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-console-config\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303804 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-image-import-ca\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303832 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-service-ca\") pod 
\"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303855 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkrc5\" (UniqueName: \"kubernetes.io/projected/5d0f9cf7-c781-4964-a714-bcd780e88285-kube-api-access-nkrc5\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303881 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-etcd-client\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303924 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/479d9009-47b4-4a26-990e-30e757c7aa17-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.303962 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-node-pullsecrets\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.304119 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4a1ec5-fba3-4058-930e-96b000e4b052-config\") pod 
\"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.304148 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e244dd83-cd20-40f8-a639-3164577c7316-serving-cert\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.304208 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e244dd83-cd20-40f8-a639-3164577c7316-etcd-client\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.304241 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e244dd83-cd20-40f8-a639-3164577c7316-etcd-ca\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.304747 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-audit-dir\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.304812 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-config\") pod \"apiserver-76f77b778f-h6hq2\" (UID: 
\"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.305232 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/774b5998-29de-4546-937e-b5d2ee0b27d4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xfcf7\" (UID: \"774b5998-29de-4546-937e-b5d2ee0b27d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.305239 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-audit\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.305286 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-node-pullsecrets\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.305600 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b4a1ec5-fba3-4058-930e-96b000e4b052-trusted-ca\") pod \"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.306106 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-etcd-serving-ca\") pod \"apiserver-76f77b778f-h6hq2\" (UID: 
\"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.306303 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4a1ec5-fba3-4058-930e-96b000e4b052-config\") pod \"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.306335 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-service-ca\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.306364 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-oauth-serving-cert\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.306905 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-image-import-ca\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.307106 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/479d9009-47b4-4a26-990e-30e757c7aa17-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.307268 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.308941 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-serving-cert\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.307554 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-trusted-ca-bundle\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.308571 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-etcd-client\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.307528 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5d0f9cf7-c781-4964-a714-bcd780e88285-console-config\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 
07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.309046 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b4a1ec5-fba3-4058-930e-96b000e4b052-serving-cert\") pod \"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.309203 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/479d9009-47b4-4a26-990e-30e757c7aa17-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.309808 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-encryption-config\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.310410 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/774b5998-29de-4546-937e-b5d2ee0b27d4-serving-cert\") pod \"openshift-config-operator-7777fb866f-xfcf7\" (UID: \"774b5998-29de-4546-937e-b5d2ee0b27d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.311602 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d0f9cf7-c781-4964-a714-bcd780e88285-console-serving-cert\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") 
" pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.320068 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5d0f9cf7-c781-4964-a714-bcd780e88285-console-oauth-config\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.320105 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.340251 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.360310 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.379578 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.400717 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.420578 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.440477 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.460648 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.465387 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e244dd83-cd20-40f8-a639-3164577c7316-etcd-ca\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.480224 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.521516 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.525823 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e244dd83-cd20-40f8-a639-3164577c7316-etcd-service-ca\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.540448 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.559956 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.571256 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e244dd83-cd20-40f8-a639-3164577c7316-serving-cert\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: 
I0307 04:22:57.581542 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.600994 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.619973 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.625384 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e244dd83-cd20-40f8-a639-3164577c7316-config\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.641089 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.651648 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e244dd83-cd20-40f8-a639-3164577c7316-etcd-client\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.660165 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.680733 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.715365 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" 
Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.729250 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.740422 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.760233 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.780657 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.802211 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.821324 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.840930 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.860865 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.880328 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.902293 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.921448 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.941413 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.960925 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 07 04:22:57 crc kubenswrapper[4689]: I0307 04:22:57.980570 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.001078 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.021217 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.040466 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.060798 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.081317 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.100985 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.121263 4689 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.141742 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.161323 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.180612 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.199291 4689 request.go:700] Waited for 1.018796418s due to client-side throttling, not priority and fairness, request: PATCH:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fcn6x/status Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.223293 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.240934 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.260628 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.294800 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.301037 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.320844 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.340069 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.361233 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.381380 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.400471 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.421037 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.440383 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.460995 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.482269 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.501119 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.549224 4689 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-p5ccn\" (UniqueName: \"kubernetes.io/projected/72404506-4e6f-4494-a61a-2ac56bd6b123-kube-api-access-p5ccn\") pod \"openshift-apiserver-operator-796bbdcf4f-hzrqj\" (UID: \"72404506-4e6f-4494-a61a-2ac56bd6b123\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.571918 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5t86\" (UniqueName: \"kubernetes.io/projected/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-kube-api-access-w5t86\") pod \"oauth-openshift-558db77b4-fcn6x\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.589689 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qq6k\" (UniqueName: \"kubernetes.io/projected/7d79bc2b-a849-4d82-bc59-197431e014db-kube-api-access-4qq6k\") pod \"controller-manager-879f6c89f-9x8l6\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.591813 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.602511 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.611127 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h52md\" (UniqueName: \"kubernetes.io/projected/afea7082-9f6d-4c1f-a9be-ad1444e1459e-kube-api-access-h52md\") pod \"route-controller-manager-6576b87f9c-2mqfk\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.621084 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.624954 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.642257 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.651807 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.661442 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.682066 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.701825 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.721569 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.773698 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.774017 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.785275 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.803343 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.820934 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.839820 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.849440 4689 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.862667 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.880397 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.886849 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj"] Mar 07 04:22:58 crc kubenswrapper[4689]: W0307 04:22:58.900487 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72404506_4e6f_4494_a61a_2ac56bd6b123.slice/crio-97269ac69f0e8729bf953003b35bbe8d2b39632fc78ed43561e0e080884e9700 WatchSource:0}: Error finding container 97269ac69f0e8729bf953003b35bbe8d2b39632fc78ed43561e0e080884e9700: Status 404 returned error can't find the container with id 97269ac69f0e8729bf953003b35bbe8d2b39632fc78ed43561e0e080884e9700 Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.901223 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.920141 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.920326 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fcn6x"] Mar 07 04:22:58 crc kubenswrapper[4689]: W0307 04:22:58.931074 4689 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4c3b676_f7ae_4659_a3f6_73dcc319bed8.slice/crio-3899f63dc0cadd52c511a0d4db8be92b4031bd10eb64625ad6cd57c7721a2027 WatchSource:0}: Error finding container 3899f63dc0cadd52c511a0d4db8be92b4031bd10eb64625ad6cd57c7721a2027: Status 404 returned error can't find the container with id 3899f63dc0cadd52c511a0d4db8be92b4031bd10eb64625ad6cd57c7721a2027 Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.939936 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.947654 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9x8l6"] Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.960727 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.967349 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" event={"ID":"e4c3b676-f7ae-4659-a3f6-73dcc319bed8","Type":"ContainerStarted","Data":"3899f63dc0cadd52c511a0d4db8be92b4031bd10eb64625ad6cd57c7721a2027"} Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.968468 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" event={"ID":"72404506-4e6f-4494-a61a-2ac56bd6b123","Type":"ContainerStarted","Data":"97269ac69f0e8729bf953003b35bbe8d2b39632fc78ed43561e0e080884e9700"} Mar 07 04:22:58 crc kubenswrapper[4689]: W0307 04:22:58.970982 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d79bc2b_a849_4d82_bc59_197431e014db.slice/crio-cfad0d99aa095414e88a7d6d2dd312e60da41fdd0c9b677431e98d15621fcc7e WatchSource:0}: Error finding 
container cfad0d99aa095414e88a7d6d2dd312e60da41fdd0c9b677431e98d15621fcc7e: Status 404 returned error can't find the container with id cfad0d99aa095414e88a7d6d2dd312e60da41fdd0c9b677431e98d15621fcc7e Mar 07 04:22:58 crc kubenswrapper[4689]: I0307 04:22:58.980615 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.002611 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.020662 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.040646 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.063428 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.065244 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk"] Mar 07 04:22:59 crc kubenswrapper[4689]: W0307 04:22:59.071819 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafea7082_9f6d_4c1f_a9be_ad1444e1459e.slice/crio-98e3c81c69dc17318e621cbd401420fb2cfae1b14a8ba8fb0b629a453883580b WatchSource:0}: Error finding container 98e3c81c69dc17318e621cbd401420fb2cfae1b14a8ba8fb0b629a453883580b: Status 404 returned error can't find the container with id 98e3c81c69dc17318e621cbd401420fb2cfae1b14a8ba8fb0b629a453883580b Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.079326 4689 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.100823 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.121423 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.141520 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.160016 4689 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.180856 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.189728 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.189825 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.218251 4689 request.go:700] Waited for 1.914094619s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.226593 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htlm7\" (UniqueName: \"kubernetes.io/projected/6b4a1ec5-fba3-4058-930e-96b000e4b052-kube-api-access-htlm7\") pod \"console-operator-58897d9998-prpp8\" (UID: \"6b4a1ec5-fba3-4058-930e-96b000e4b052\") " pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.239093 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74sft\" (UniqueName: \"kubernetes.io/projected/774b5998-29de-4546-937e-b5d2ee0b27d4-kube-api-access-74sft\") pod \"openshift-config-operator-7777fb866f-xfcf7\" (UID: \"774b5998-29de-4546-937e-b5d2ee0b27d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.260161 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/479d9009-47b4-4a26-990e-30e757c7aa17-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.274342 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj55s\" (UniqueName: \"kubernetes.io/projected/e244dd83-cd20-40f8-a639-3164577c7316-kube-api-access-sj55s\") pod \"etcd-operator-b45778765-w89ns\" (UID: \"e244dd83-cd20-40f8-a639-3164577c7316\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.305627 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkrc5\" 
(UniqueName: \"kubernetes.io/projected/5d0f9cf7-c781-4964-a714-bcd780e88285-kube-api-access-nkrc5\") pod \"console-f9d7485db-j4z8p\" (UID: \"5d0f9cf7-c781-4964-a714-bcd780e88285\") " pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.310011 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.317883 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8shw9\" (UniqueName: \"kubernetes.io/projected/479d9009-47b4-4a26-990e-30e757c7aa17-kube-api-access-8shw9\") pod \"cluster-image-registry-operator-dc59b4c8b-xpp56\" (UID: \"479d9009-47b4-4a26-990e-30e757c7aa17\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.341075 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnkds\" (UniqueName: \"kubernetes.io/projected/8f06b111-b994-4bb2-b1f3-1033b5cde4aa-kube-api-access-gnkds\") pod \"apiserver-76f77b778f-h6hq2\" (UID: \"8f06b111-b994-4bb2-b1f3-1033b5cde4aa\") " pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.353832 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.363519 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.383515 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.396769 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437038 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e4bbf5e-dcd1-4e37-ab88-1ce0def71019-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n8r6f\" (UID: \"1e4bbf5e-dcd1-4e37-ab88-1ce0def71019\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437104 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb494b13-9120-4ff9-8349-48568da9e990-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g5279\" (UID: \"bb494b13-9120-4ff9-8349-48568da9e990\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437352 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c0f499-79e0-4090-bfaa-3d8606e04925-config\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437440 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60af193a-2553-4f45-b190-c86e1e3594e1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: 
\"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437473 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fe2e7665-098b-4338-9ff3-f936514ebbb9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437540 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437564 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14c0f499-79e0-4090-bfaa-3d8606e04925-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437588 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec0b40d-04d4-486b-93bc-361c72d74aad-config\") pod \"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437611 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe2e7665-098b-4338-9ff3-f936514ebbb9-serving-cert\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437636 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60af193a-2553-4f45-b190-c86e1e3594e1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437658 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krwtj\" (UniqueName: \"kubernetes.io/projected/b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2-kube-api-access-krwtj\") pod \"openshift-controller-manager-operator-756b6f6bc6-qng8x\" (UID: \"b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437685 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3ec0b40d-04d4-486b-93bc-361c72d74aad-images\") pod \"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437705 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ec0b40d-04d4-486b-93bc-361c72d74aad-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437723 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d065e112-052a-4e44-87f6-7713ebdfa2bd-metrics-tls\") pod \"dns-operator-744455d44c-6xlwc\" (UID: \"d065e112-052a-4e44-87f6-7713ebdfa2bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437752 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-config\") pod \"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437774 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14c0f499-79e0-4090-bfaa-3d8606e04925-serving-cert\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.437800 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e4bbf5e-dcd1-4e37-ab88-1ce0def71019-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n8r6f\" (UID: \"1e4bbf5e-dcd1-4e37-ab88-1ce0def71019\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" Mar 07 04:22:59 crc kubenswrapper[4689]: E0307 04:22:59.438539 4689 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:22:59.938515435 +0000 UTC m=+224.984898924 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.438628 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9rs8\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-kube-api-access-h9rs8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.438763 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hztpp\" (UniqueName: \"kubernetes.io/projected/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-kube-api-access-hztpp\") pod \"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.438800 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-trusted-ca\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: 
\"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.438843 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rstc9\" (UniqueName: \"kubernetes.io/projected/37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b-kube-api-access-rstc9\") pod \"kube-storage-version-migrator-operator-b67b599dd-n9g9l\" (UID: \"37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.438953 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb494b13-9120-4ff9-8349-48568da9e990-config\") pod \"kube-apiserver-operator-766d6c64bb-g5279\" (UID: \"bb494b13-9120-4ff9-8349-48568da9e990\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439010 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qng8x\" (UID: \"b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439049 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe2e7665-098b-4338-9ff3-f936514ebbb9-audit-policies\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc 
kubenswrapper[4689]: I0307 04:22:59.439089 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmvkg\" (UniqueName: \"kubernetes.io/projected/fe2e7665-098b-4338-9ff3-f936514ebbb9-kube-api-access-cmvkg\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439139 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w2lb\" (UniqueName: \"kubernetes.io/projected/423b5174-7bed-4fba-af44-51abd9188676-kube-api-access-8w2lb\") pod \"downloads-7954f5f757-nnnmk\" (UID: \"423b5174-7bed-4fba-af44-51abd9188676\") " pod="openshift-console/downloads-7954f5f757-nnnmk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439370 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e4bbf5e-dcd1-4e37-ab88-1ce0def71019-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n8r6f\" (UID: \"1e4bbf5e-dcd1-4e37-ab88-1ce0def71019\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439497 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qng8x\" (UID: \"b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439557 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-bound-sa-token\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439575 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fe2e7665-098b-4338-9ff3-f936514ebbb9-encryption-config\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439624 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z68gz\" (UniqueName: \"kubernetes.io/projected/3ec0b40d-04d4-486b-93bc-361c72d74aad-kube-api-access-z68gz\") pod \"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439655 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7qls\" (UniqueName: \"kubernetes.io/projected/14c0f499-79e0-4090-bfaa-3d8606e04925-kube-api-access-m7qls\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439754 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-n9g9l\" (UID: \"37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439771 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14c0f499-79e0-4090-bfaa-3d8606e04925-service-ca-bundle\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439799 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-machine-approver-tls\") pod \"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439828 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-auth-proxy-config\") pod \"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439848 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe2e7665-098b-4338-9ff3-f936514ebbb9-audit-dir\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439890 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fe2e7665-098b-4338-9ff3-f936514ebbb9-etcd-client\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439924 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-registry-certificates\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439967 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-n9g9l\" (UID: \"37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.439989 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qzrk\" (UniqueName: \"kubernetes.io/projected/d065e112-052a-4e44-87f6-7713ebdfa2bd-kube-api-access-6qzrk\") pod \"dns-operator-744455d44c-6xlwc\" (UID: \"d065e112-052a-4e44-87f6-7713ebdfa2bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.440197 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-registry-tls\") pod 
\"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.440258 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe2e7665-098b-4338-9ff3-f936514ebbb9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.440353 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb494b13-9120-4ff9-8349-48568da9e990-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-g5279\" (UID: \"bb494b13-9120-4ff9-8349-48568da9e990\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.448203 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.540909 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541114 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9rs8\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-kube-api-access-h9rs8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541157 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmm87\" (UniqueName: \"kubernetes.io/projected/838dc182-e289-4769-98b0-e76ad62793c1-kube-api-access-zmm87\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541206 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brvc5\" (UniqueName: \"kubernetes.io/projected/34925a57-7fd9-4a0e-955c-cbc1ad264fed-kube-api-access-brvc5\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541228 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/340f2b24-7f0e-4198-bb0c-6c4f50e4fac9-cert\") pod \"ingress-canary-xwr6f\" (UID: \"340f2b24-7f0e-4198-bb0c-6c4f50e4fac9\") " pod="openshift-ingress-canary/ingress-canary-xwr6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541271 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-trusted-ca\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541290 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rstc9\" (UniqueName: \"kubernetes.io/projected/37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b-kube-api-access-rstc9\") pod \"kube-storage-version-migrator-operator-b67b599dd-n9g9l\" (UID: \"37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541305 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb494b13-9120-4ff9-8349-48568da9e990-config\") pod \"kube-apiserver-operator-766d6c64bb-g5279\" (UID: \"bb494b13-9120-4ff9-8349-48568da9e990\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541320 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe2e7665-098b-4338-9ff3-f936514ebbb9-audit-policies\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541334 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmvkg\" (UniqueName: \"kubernetes.io/projected/fe2e7665-098b-4338-9ff3-f936514ebbb9-kube-api-access-cmvkg\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541349 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lvzm2\" (UID: \"1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541369 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/030a2c5c-27d3-4eb6-889c-1888b80e9eef-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tbpn9\" (UID: \"030a2c5c-27d3-4eb6-889c-1888b80e9eef\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541397 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qng8x\" (UID: \"b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541414 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-plugins-dir\") pod 
\"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541470 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnhp7\" (UniqueName: \"kubernetes.io/projected/33a94bd2-f479-403b-9c36-a708410864aa-kube-api-access-fnhp7\") pod \"auto-csr-approver-29547622-4796h\" (UID: \"33a94bd2-f479-403b-9c36-a708410864aa\") " pod="openshift-infra/auto-csr-approver-29547622-4796h" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541497 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8541392-5b56-4a5d-ae7b-fd68ffdc2a85-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-b5kjm\" (UID: \"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541524 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-service-ca-bundle\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541542 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79a74909-eddf-4d5f-b43e-d6a790ff4d52-profile-collector-cert\") pod \"catalog-operator-68c6474976-9hbtn\" (UID: \"79a74909-eddf-4d5f-b43e-d6a790ff4d52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541568 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fe2e7665-098b-4338-9ff3-f936514ebbb9-encryption-config\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541584 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cscmx\" (UniqueName: \"kubernetes.io/projected/c8541392-5b56-4a5d-ae7b-fd68ffdc2a85-kube-api-access-cscmx\") pod \"machine-config-controller-84d6567774-b5kjm\" (UID: \"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541599 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvnk5\" (UniqueName: \"kubernetes.io/projected/320d5766-4cb7-4818-9072-86bfe7e7279d-kube-api-access-pvnk5\") pod \"collect-profiles-29547615-6d5r5\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541618 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvg2b\" (UniqueName: \"kubernetes.io/projected/253d51b5-b44c-42ea-b259-aa9ff80888d6-kube-api-access-tvg2b\") pod \"service-ca-operator-777779d784-djkqv\" (UID: \"253d51b5-b44c-42ea-b259-aa9ff80888d6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541666 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/468005f5-e421-4e6e-950e-c5232f78adc8-config\") pod 
\"kube-controller-manager-operator-78b949d7b-4z4gl\" (UID: \"468005f5-e421-4e6e-950e-c5232f78adc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541700 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/320d5766-4cb7-4818-9072-86bfe7e7279d-secret-volume\") pod \"collect-profiles-29547615-6d5r5\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541716 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgn2t\" (UniqueName: \"kubernetes.io/projected/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-kube-api-access-rgn2t\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541735 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34925a57-7fd9-4a0e-955c-cbc1ad264fed-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541752 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-n9g9l\" (UID: \"37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" 
Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541768 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-machine-approver-tls\") pod \"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541785 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-registration-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541808 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fe2e7665-098b-4338-9ff3-f936514ebbb9-etcd-client\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541826 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-registry-tls\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541843 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-registry-certificates\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: 
\"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541861 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qzrk\" (UniqueName: \"kubernetes.io/projected/d065e112-052a-4e44-87f6-7713ebdfa2bd-kube-api-access-6qzrk\") pod \"dns-operator-744455d44c-6xlwc\" (UID: \"d065e112-052a-4e44-87f6-7713ebdfa2bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541887 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe2e7665-098b-4338-9ff3-f936514ebbb9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541906 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp9jx\" (UniqueName: \"kubernetes.io/projected/340f2b24-7f0e-4198-bb0c-6c4f50e4fac9-kube-api-access-wp9jx\") pod \"ingress-canary-xwr6f\" (UID: \"340f2b24-7f0e-4198-bb0c-6c4f50e4fac9\") " pod="openshift-ingress-canary/ingress-canary-xwr6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541923 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m4p5r\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541941 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmltb\" 
(UniqueName: \"kubernetes.io/projected/79a74909-eddf-4d5f-b43e-d6a790ff4d52-kube-api-access-mmltb\") pod \"catalog-operator-68c6474976-9hbtn\" (UID: \"79a74909-eddf-4d5f-b43e-d6a790ff4d52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541958 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/34925a57-7fd9-4a0e-955c-cbc1ad264fed-images\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541975 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e4bbf5e-dcd1-4e37-ab88-1ce0def71019-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n8r6f\" (UID: \"1e4bbf5e-dcd1-4e37-ab88-1ce0def71019\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.541992 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb494b13-9120-4ff9-8349-48568da9e990-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g5279\" (UID: \"bb494b13-9120-4ff9-8349-48568da9e990\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542019 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m495j\" (UniqueName: \"kubernetes.io/projected/836ad923-c529-404d-82cb-6771c4932549-kube-api-access-m495j\") pod \"migrator-59844c95c7-4rlvc\" (UID: \"836ad923-c529-404d-82cb-6771c4932549\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542037 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcaa0a81-da24-4346-b670-7ad3a516d8f6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7vcmc\" (UID: \"fcaa0a81-da24-4346-b670-7ad3a516d8f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542054 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-metrics-tls\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542113 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60af193a-2553-4f45-b190-c86e1e3594e1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542129 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/468005f5-e421-4e6e-950e-c5232f78adc8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4z4gl\" (UID: \"468005f5-e421-4e6e-950e-c5232f78adc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542185 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npnd9\" (UniqueName: \"kubernetes.io/projected/76eea8f9-8567-496d-ac53-575a25a140de-kube-api-access-npnd9\") pod \"marketplace-operator-79b997595-m4p5r\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542212 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9053087d-d7b8-4835-a6f0-2e0bd3d16388-config-volume\") pod \"dns-default-8ls5c\" (UID: \"9053087d-d7b8-4835-a6f0-2e0bd3d16388\") " pod="openshift-dns/dns-default-8ls5c" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542239 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14c0f499-79e0-4090-bfaa-3d8606e04925-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542259 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec0b40d-04d4-486b-93bc-361c72d74aad-config\") pod \"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542274 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/320d5766-4cb7-4818-9072-86bfe7e7279d-config-volume\") pod \"collect-profiles-29547615-6d5r5\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542292 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe2e7665-098b-4338-9ff3-f936514ebbb9-serving-cert\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542320 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/978fa00f-eb81-4333-8589-a484358f3f09-signing-key\") pod \"service-ca-9c57cc56f-zdmpn\" (UID: \"978fa00f-eb81-4333-8589-a484358f3f09\") " pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542345 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3ec0b40d-04d4-486b-93bc-361c72d74aad-images\") pod \"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542364 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-config\") pod \"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542379 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e-srv-cert\") pod \"olm-operator-6b444d44fb-lvzm2\" (UID: 
\"1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542406 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-csi-data-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542425 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-socket-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542452 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/337ffd12-61ea-489d-94a6-4424e7eae3af-apiservice-cert\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542470 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pstgz\" (UniqueName: \"kubernetes.io/projected/076fb655-c00f-4613-9c9a-5635aa6d3ddf-kube-api-access-pstgz\") pod \"machine-config-server-q6whv\" (UID: \"076fb655-c00f-4613-9c9a-5635aa6d3ddf\") " pod="openshift-machine-config-operator/machine-config-server-q6whv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542488 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hztpp\" (UniqueName: 
\"kubernetes.io/projected/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-kube-api-access-hztpp\") pod \"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542528 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w2lb\" (UniqueName: \"kubernetes.io/projected/423b5174-7bed-4fba-af44-51abd9188676-kube-api-access-8w2lb\") pod \"downloads-7954f5f757-nnnmk\" (UID: \"423b5174-7bed-4fba-af44-51abd9188676\") " pod="openshift-console/downloads-7954f5f757-nnnmk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542545 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/468005f5-e421-4e6e-950e-c5232f78adc8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4z4gl\" (UID: \"468005f5-e421-4e6e-950e-c5232f78adc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542564 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e4bbf5e-dcd1-4e37-ab88-1ce0def71019-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n8r6f\" (UID: \"1e4bbf5e-dcd1-4e37-ab88-1ce0def71019\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542582 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/337ffd12-61ea-489d-94a6-4424e7eae3af-tmpfs\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 
07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542598 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvtn8\" (UniqueName: \"kubernetes.io/projected/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-kube-api-access-hvtn8\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542672 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qng8x\" (UID: \"b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542694 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34925a57-7fd9-4a0e-955c-cbc1ad264fed-proxy-tls\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542720 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-bound-sa-token\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542738 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z68gz\" (UniqueName: 
\"kubernetes.io/projected/3ec0b40d-04d4-486b-93bc-361c72d74aad-kube-api-access-z68gz\") pod \"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542765 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxgsz\" (UniqueName: \"kubernetes.io/projected/9053087d-d7b8-4835-a6f0-2e0bd3d16388-kube-api-access-fxgsz\") pod \"dns-default-8ls5c\" (UID: \"9053087d-d7b8-4835-a6f0-2e0bd3d16388\") " pod="openshift-dns/dns-default-8ls5c" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542782 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/076fb655-c00f-4613-9c9a-5635aa6d3ddf-node-bootstrap-token\") pod \"machine-config-server-q6whv\" (UID: \"076fb655-c00f-4613-9c9a-5635aa6d3ddf\") " pod="openshift-machine-config-operator/machine-config-server-q6whv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542799 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-stats-auth\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542816 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7qls\" (UniqueName: \"kubernetes.io/projected/14c0f499-79e0-4090-bfaa-3d8606e04925-kube-api-access-m7qls\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: 
I0307 04:22:59.542846 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/978fa00f-eb81-4333-8589-a484358f3f09-signing-cabundle\") pod \"service-ca-9c57cc56f-zdmpn\" (UID: \"978fa00f-eb81-4333-8589-a484358f3f09\") " pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542864 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/253d51b5-b44c-42ea-b259-aa9ff80888d6-serving-cert\") pod \"service-ca-operator-777779d784-djkqv\" (UID: \"253d51b5-b44c-42ea-b259-aa9ff80888d6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542880 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfvx6\" (UniqueName: \"kubernetes.io/projected/030a2c5c-27d3-4eb6-889c-1888b80e9eef-kube-api-access-wfvx6\") pod \"multus-admission-controller-857f4d67dd-tbpn9\" (UID: \"030a2c5c-27d3-4eb6-889c-1888b80e9eef\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542907 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14c0f499-79e0-4090-bfaa-3d8606e04925-service-ca-bundle\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542923 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/076fb655-c00f-4613-9c9a-5635aa6d3ddf-certs\") pod 
\"machine-config-server-q6whv\" (UID: \"076fb655-c00f-4613-9c9a-5635aa6d3ddf\") " pod="openshift-machine-config-operator/machine-config-server-q6whv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542942 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee57bd24-197d-4722-9a1a-a73e914a0973-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wblnn\" (UID: \"ee57bd24-197d-4722-9a1a-a73e914a0973\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542961 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-auth-proxy-config\") pod \"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.542977 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq28m\" (UniqueName: \"kubernetes.io/projected/337ffd12-61ea-489d-94a6-4424e7eae3af-kube-api-access-kq28m\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543020 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe2e7665-098b-4338-9ff3-f936514ebbb9-audit-dir\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543039 
4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/253d51b5-b44c-42ea-b259-aa9ff80888d6-config\") pod \"service-ca-operator-777779d784-djkqv\" (UID: \"253d51b5-b44c-42ea-b259-aa9ff80888d6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543067 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-n9g9l\" (UID: \"37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543092 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb494b13-9120-4ff9-8349-48568da9e990-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-g5279\" (UID: \"bb494b13-9120-4ff9-8349-48568da9e990\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543108 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9053087d-d7b8-4835-a6f0-2e0bd3d16388-metrics-tls\") pod \"dns-default-8ls5c\" (UID: \"9053087d-d7b8-4835-a6f0-2e0bd3d16388\") " pod="openshift-dns/dns-default-8ls5c" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543135 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79a74909-eddf-4d5f-b43e-d6a790ff4d52-srv-cert\") pod \"catalog-operator-68c6474976-9hbtn\" (UID: \"79a74909-eddf-4d5f-b43e-d6a790ff4d52\") 
" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543152 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a55cc042-6fa1-45a3-be75-9eb886b29a5a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s654w\" (UID: \"a55cc042-6fa1-45a3-be75-9eb886b29a5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543183 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8541392-5b56-4a5d-ae7b-fd68ffdc2a85-proxy-tls\") pod \"machine-config-controller-84d6567774-b5kjm\" (UID: \"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543202 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c0f499-79e0-4090-bfaa-3d8606e04925-config\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543220 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/337ffd12-61ea-489d-94a6-4424e7eae3af-webhook-cert\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543236 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7m96\" (UniqueName: \"kubernetes.io/projected/978fa00f-eb81-4333-8589-a484358f3f09-kube-api-access-c7m96\") pod \"service-ca-9c57cc56f-zdmpn\" (UID: \"978fa00f-eb81-4333-8589-a484358f3f09\") " pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543252 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-default-certificate\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543267 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m4p5r\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543304 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fe2e7665-098b-4338-9ff3-f936514ebbb9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543321 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-mountpoint-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " 
pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543340 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5t54\" (UniqueName: \"kubernetes.io/projected/fcaa0a81-da24-4346-b670-7ad3a516d8f6-kube-api-access-s5t54\") pod \"cluster-samples-operator-665b6dd947-7vcmc\" (UID: \"fcaa0a81-da24-4346-b670-7ad3a516d8f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543357 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543433 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60af193a-2553-4f45-b190-c86e1e3594e1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543451 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-trusted-ca\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543467 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qtw\" 
(UniqueName: \"kubernetes.io/projected/ee57bd24-197d-4722-9a1a-a73e914a0973-kube-api-access-m6qtw\") pod \"control-plane-machine-set-operator-78cbb6b69f-wblnn\" (UID: \"ee57bd24-197d-4722-9a1a-a73e914a0973\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543495 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krwtj\" (UniqueName: \"kubernetes.io/projected/b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2-kube-api-access-krwtj\") pod \"openshift-controller-manager-operator-756b6f6bc6-qng8x\" (UID: \"b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543511 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ec0b40d-04d4-486b-93bc-361c72d74aad-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543546 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d065e112-052a-4e44-87f6-7713ebdfa2bd-metrics-tls\") pod \"dns-operator-744455d44c-6xlwc\" (UID: \"d065e112-052a-4e44-87f6-7713ebdfa2bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543574 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14c0f499-79e0-4090-bfaa-3d8606e04925-serving-cert\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543592 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2hdm\" (UniqueName: \"kubernetes.io/projected/1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e-kube-api-access-z2hdm\") pod \"olm-operator-6b444d44fb-lvzm2\" (UID: \"1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543608 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpzxh\" (UniqueName: \"kubernetes.io/projected/a55cc042-6fa1-45a3-be75-9eb886b29a5a-kube-api-access-kpzxh\") pod \"package-server-manager-789f6589d5-s654w\" (UID: \"a55cc042-6fa1-45a3-be75-9eb886b29a5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543626 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e4bbf5e-dcd1-4e37-ab88-1ce0def71019-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n8r6f\" (UID: \"1e4bbf5e-dcd1-4e37-ab88-1ce0def71019\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.543641 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-metrics-certs\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: E0307 04:22:59.543752 4689 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:00.043737733 +0000 UTC m=+225.090121222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.548396 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-trusted-ca\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.548961 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb494b13-9120-4ff9-8349-48568da9e990-config\") pod \"kube-apiserver-operator-766d6c64bb-g5279\" (UID: \"bb494b13-9120-4ff9-8349-48568da9e990\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.550144 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe2e7665-098b-4338-9ff3-f936514ebbb9-audit-policies\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.550426 4689 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe2e7665-098b-4338-9ff3-f936514ebbb9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.551526 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-registry-certificates\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.553062 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14c0f499-79e0-4090-bfaa-3d8606e04925-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.553555 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e4bbf5e-dcd1-4e37-ab88-1ce0def71019-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n8r6f\" (UID: \"1e4bbf5e-dcd1-4e37-ab88-1ce0def71019\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.556255 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec0b40d-04d4-486b-93bc-361c72d74aad-config\") pod \"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.556457 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60af193a-2553-4f45-b190-c86e1e3594e1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.557380 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c0f499-79e0-4090-bfaa-3d8606e04925-config\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.559647 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-auth-proxy-config\") pod \"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.562627 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb494b13-9120-4ff9-8349-48568da9e990-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-g5279\" (UID: \"bb494b13-9120-4ff9-8349-48568da9e990\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.563353 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-config\") pod 
\"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.563592 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fe2e7665-098b-4338-9ff3-f936514ebbb9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.563917 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d065e112-052a-4e44-87f6-7713ebdfa2bd-metrics-tls\") pod \"dns-operator-744455d44c-6xlwc\" (UID: \"d065e112-052a-4e44-87f6-7713ebdfa2bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.564696 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-n9g9l\" (UID: \"37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.564717 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ec0b40d-04d4-486b-93bc-361c72d74aad-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.564764 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/3ec0b40d-04d4-486b-93bc-361c72d74aad-images\") pod \"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.570964 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-n9g9l\" (UID: \"37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.571046 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fe2e7665-098b-4338-9ff3-f936514ebbb9-etcd-client\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.571302 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe2e7665-098b-4338-9ff3-f936514ebbb9-audit-dir\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.571623 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe2e7665-098b-4338-9ff3-f936514ebbb9-serving-cert\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.572271 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/14c0f499-79e0-4090-bfaa-3d8606e04925-serving-cert\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.572539 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qng8x\" (UID: \"b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.572914 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14c0f499-79e0-4090-bfaa-3d8606e04925-service-ca-bundle\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.573018 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e4bbf5e-dcd1-4e37-ab88-1ce0def71019-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n8r6f\" (UID: \"1e4bbf5e-dcd1-4e37-ab88-1ce0def71019\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.573436 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qng8x\" (UID: \"b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.575117 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-registry-tls\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.576780 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-machine-approver-tls\") pod \"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.577528 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.578666 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60af193a-2553-4f45-b190-c86e1e3594e1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.579888 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fe2e7665-098b-4338-9ff3-f936514ebbb9-encryption-config\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.596190 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9rs8\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-kube-api-access-h9rs8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.606888 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rstc9\" (UniqueName: \"kubernetes.io/projected/37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b-kube-api-access-rstc9\") pod \"kube-storage-version-migrator-operator-b67b599dd-n9g9l\" (UID: \"37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.619484 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qzrk\" (UniqueName: 
\"kubernetes.io/projected/d065e112-052a-4e44-87f6-7713ebdfa2bd-kube-api-access-6qzrk\") pod \"dns-operator-744455d44c-6xlwc\" (UID: \"d065e112-052a-4e44-87f6-7713ebdfa2bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.634759 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hztpp\" (UniqueName: \"kubernetes.io/projected/7e52c228-a5cf-4b90-a8bc-4926c2d58ec0-kube-api-access-hztpp\") pod \"machine-approver-56656f9798-g9xzr\" (UID: \"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.636735 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645405 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-service-ca-bundle\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645480 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79a74909-eddf-4d5f-b43e-d6a790ff4d52-profile-collector-cert\") pod \"catalog-operator-68c6474976-9hbtn\" (UID: \"79a74909-eddf-4d5f-b43e-d6a790ff4d52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645540 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cscmx\" (UniqueName: \"kubernetes.io/projected/c8541392-5b56-4a5d-ae7b-fd68ffdc2a85-kube-api-access-cscmx\") 
pod \"machine-config-controller-84d6567774-b5kjm\" (UID: \"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645561 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvnk5\" (UniqueName: \"kubernetes.io/projected/320d5766-4cb7-4818-9072-86bfe7e7279d-kube-api-access-pvnk5\") pod \"collect-profiles-29547615-6d5r5\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645611 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvg2b\" (UniqueName: \"kubernetes.io/projected/253d51b5-b44c-42ea-b259-aa9ff80888d6-kube-api-access-tvg2b\") pod \"service-ca-operator-777779d784-djkqv\" (UID: \"253d51b5-b44c-42ea-b259-aa9ff80888d6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645638 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/320d5766-4cb7-4818-9072-86bfe7e7279d-secret-volume\") pod \"collect-profiles-29547615-6d5r5\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645656 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/468005f5-e421-4e6e-950e-c5232f78adc8-config\") pod \"kube-controller-manager-operator-78b949d7b-4z4gl\" (UID: \"468005f5-e421-4e6e-950e-c5232f78adc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645697 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgn2t\" (UniqueName: \"kubernetes.io/projected/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-kube-api-access-rgn2t\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645716 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34925a57-7fd9-4a0e-955c-cbc1ad264fed-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645736 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-registration-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645778 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp9jx\" (UniqueName: \"kubernetes.io/projected/340f2b24-7f0e-4198-bb0c-6c4f50e4fac9-kube-api-access-wp9jx\") pod \"ingress-canary-xwr6f\" (UID: \"340f2b24-7f0e-4198-bb0c-6c4f50e4fac9\") " pod="openshift-ingress-canary/ingress-canary-xwr6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645807 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m4p5r\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645846 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmltb\" (UniqueName: \"kubernetes.io/projected/79a74909-eddf-4d5f-b43e-d6a790ff4d52-kube-api-access-mmltb\") pod \"catalog-operator-68c6474976-9hbtn\" (UID: \"79a74909-eddf-4d5f-b43e-d6a790ff4d52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645865 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/34925a57-7fd9-4a0e-955c-cbc1ad264fed-images\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645899 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m495j\" (UniqueName: \"kubernetes.io/projected/836ad923-c529-404d-82cb-6771c4932549-kube-api-access-m495j\") pod \"migrator-59844c95c7-4rlvc\" (UID: \"836ad923-c529-404d-82cb-6771c4932549\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645943 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcaa0a81-da24-4346-b670-7ad3a516d8f6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7vcmc\" (UID: \"fcaa0a81-da24-4346-b670-7ad3a516d8f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645961 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-metrics-tls\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.645988 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/468005f5-e421-4e6e-950e-c5232f78adc8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4z4gl\" (UID: \"468005f5-e421-4e6e-950e-c5232f78adc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646042 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npnd9\" (UniqueName: \"kubernetes.io/projected/76eea8f9-8567-496d-ac53-575a25a140de-kube-api-access-npnd9\") pod \"marketplace-operator-79b997595-m4p5r\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646071 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9053087d-d7b8-4835-a6f0-2e0bd3d16388-config-volume\") pod \"dns-default-8ls5c\" (UID: \"9053087d-d7b8-4835-a6f0-2e0bd3d16388\") " pod="openshift-dns/dns-default-8ls5c" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646093 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 
04:22:59.646114 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/320d5766-4cb7-4818-9072-86bfe7e7279d-config-volume\") pod \"collect-profiles-29547615-6d5r5\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646137 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/978fa00f-eb81-4333-8589-a484358f3f09-signing-key\") pod \"service-ca-9c57cc56f-zdmpn\" (UID: \"978fa00f-eb81-4333-8589-a484358f3f09\") " pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646197 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e-srv-cert\") pod \"olm-operator-6b444d44fb-lvzm2\" (UID: \"1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646219 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-csi-data-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646237 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-socket-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646256 
4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/337ffd12-61ea-489d-94a6-4424e7eae3af-apiservice-cert\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646273 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pstgz\" (UniqueName: \"kubernetes.io/projected/076fb655-c00f-4613-9c9a-5635aa6d3ddf-kube-api-access-pstgz\") pod \"machine-config-server-q6whv\" (UID: \"076fb655-c00f-4613-9c9a-5635aa6d3ddf\") " pod="openshift-machine-config-operator/machine-config-server-q6whv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646301 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/468005f5-e421-4e6e-950e-c5232f78adc8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4z4gl\" (UID: \"468005f5-e421-4e6e-950e-c5232f78adc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646327 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/337ffd12-61ea-489d-94a6-4424e7eae3af-tmpfs\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646344 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvtn8\" (UniqueName: \"kubernetes.io/projected/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-kube-api-access-hvtn8\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646364 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34925a57-7fd9-4a0e-955c-cbc1ad264fed-proxy-tls\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646426 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxgsz\" (UniqueName: \"kubernetes.io/projected/9053087d-d7b8-4835-a6f0-2e0bd3d16388-kube-api-access-fxgsz\") pod \"dns-default-8ls5c\" (UID: \"9053087d-d7b8-4835-a6f0-2e0bd3d16388\") " pod="openshift-dns/dns-default-8ls5c" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646446 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/076fb655-c00f-4613-9c9a-5635aa6d3ddf-node-bootstrap-token\") pod \"machine-config-server-q6whv\" (UID: \"076fb655-c00f-4613-9c9a-5635aa6d3ddf\") " pod="openshift-machine-config-operator/machine-config-server-q6whv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646465 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/978fa00f-eb81-4333-8589-a484358f3f09-signing-cabundle\") pod \"service-ca-9c57cc56f-zdmpn\" (UID: \"978fa00f-eb81-4333-8589-a484358f3f09\") " pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646482 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-stats-auth\") pod \"router-default-5444994796-7dvxk\" (UID: 
\"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646509 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfvx6\" (UniqueName: \"kubernetes.io/projected/030a2c5c-27d3-4eb6-889c-1888b80e9eef-kube-api-access-wfvx6\") pod \"multus-admission-controller-857f4d67dd-tbpn9\" (UID: \"030a2c5c-27d3-4eb6-889c-1888b80e9eef\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646529 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/253d51b5-b44c-42ea-b259-aa9ff80888d6-serving-cert\") pod \"service-ca-operator-777779d784-djkqv\" (UID: \"253d51b5-b44c-42ea-b259-aa9ff80888d6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646545 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/076fb655-c00f-4613-9c9a-5635aa6d3ddf-certs\") pod \"machine-config-server-q6whv\" (UID: \"076fb655-c00f-4613-9c9a-5635aa6d3ddf\") " pod="openshift-machine-config-operator/machine-config-server-q6whv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646563 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee57bd24-197d-4722-9a1a-a73e914a0973-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wblnn\" (UID: \"ee57bd24-197d-4722-9a1a-a73e914a0973\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646584 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kq28m\" (UniqueName: \"kubernetes.io/projected/337ffd12-61ea-489d-94a6-4424e7eae3af-kube-api-access-kq28m\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646602 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/253d51b5-b44c-42ea-b259-aa9ff80888d6-config\") pod \"service-ca-operator-777779d784-djkqv\" (UID: \"253d51b5-b44c-42ea-b259-aa9ff80888d6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646618 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9053087d-d7b8-4835-a6f0-2e0bd3d16388-metrics-tls\") pod \"dns-default-8ls5c\" (UID: \"9053087d-d7b8-4835-a6f0-2e0bd3d16388\") " pod="openshift-dns/dns-default-8ls5c" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646640 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8541392-5b56-4a5d-ae7b-fd68ffdc2a85-proxy-tls\") pod \"machine-config-controller-84d6567774-b5kjm\" (UID: \"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646658 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79a74909-eddf-4d5f-b43e-d6a790ff4d52-srv-cert\") pod \"catalog-operator-68c6474976-9hbtn\" (UID: \"79a74909-eddf-4d5f-b43e-d6a790ff4d52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646678 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a55cc042-6fa1-45a3-be75-9eb886b29a5a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s654w\" (UID: \"a55cc042-6fa1-45a3-be75-9eb886b29a5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646696 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/337ffd12-61ea-489d-94a6-4424e7eae3af-webhook-cert\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646721 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7m96\" (UniqueName: \"kubernetes.io/projected/978fa00f-eb81-4333-8589-a484358f3f09-kube-api-access-c7m96\") pod \"service-ca-9c57cc56f-zdmpn\" (UID: \"978fa00f-eb81-4333-8589-a484358f3f09\") " pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646738 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-default-certificate\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646775 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m4p5r\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646805 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-mountpoint-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646828 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5t54\" (UniqueName: \"kubernetes.io/projected/fcaa0a81-da24-4346-b670-7ad3a516d8f6-kube-api-access-s5t54\") pod \"cluster-samples-operator-665b6dd947-7vcmc\" (UID: \"fcaa0a81-da24-4346-b670-7ad3a516d8f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646845 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646866 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-trusted-ca\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646889 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qtw\" (UniqueName: 
\"kubernetes.io/projected/ee57bd24-197d-4722-9a1a-a73e914a0973-kube-api-access-m6qtw\") pod \"control-plane-machine-set-operator-78cbb6b69f-wblnn\" (UID: \"ee57bd24-197d-4722-9a1a-a73e914a0973\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646919 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-metrics-certs\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646935 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2hdm\" (UniqueName: \"kubernetes.io/projected/1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e-kube-api-access-z2hdm\") pod \"olm-operator-6b444d44fb-lvzm2\" (UID: \"1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646952 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpzxh\" (UniqueName: \"kubernetes.io/projected/a55cc042-6fa1-45a3-be75-9eb886b29a5a-kube-api-access-kpzxh\") pod \"package-server-manager-789f6589d5-s654w\" (UID: \"a55cc042-6fa1-45a3-be75-9eb886b29a5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646958 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/34925a57-7fd9-4a0e-955c-cbc1ad264fed-images\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:59 crc 
kubenswrapper[4689]: I0307 04:22:59.646974 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmm87\" (UniqueName: \"kubernetes.io/projected/838dc182-e289-4769-98b0-e76ad62793c1-kube-api-access-zmm87\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.646993 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brvc5\" (UniqueName: \"kubernetes.io/projected/34925a57-7fd9-4a0e-955c-cbc1ad264fed-kube-api-access-brvc5\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.647018 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/340f2b24-7f0e-4198-bb0c-6c4f50e4fac9-cert\") pod \"ingress-canary-xwr6f\" (UID: \"340f2b24-7f0e-4198-bb0c-6c4f50e4fac9\") " pod="openshift-ingress-canary/ingress-canary-xwr6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.647046 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lvzm2\" (UID: \"1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.647069 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/030a2c5c-27d3-4eb6-889c-1888b80e9eef-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tbpn9\" (UID: \"030a2c5c-27d3-4eb6-889c-1888b80e9eef\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.647085 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-plugins-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.647105 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnhp7\" (UniqueName: \"kubernetes.io/projected/33a94bd2-f479-403b-9c36-a708410864aa-kube-api-access-fnhp7\") pod \"auto-csr-approver-29547622-4796h\" (UID: \"33a94bd2-f479-403b-9c36-a708410864aa\") " pod="openshift-infra/auto-csr-approver-29547622-4796h" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.647123 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8541392-5b56-4a5d-ae7b-fd68ffdc2a85-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-b5kjm\" (UID: \"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.647795 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-service-ca-bundle\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.647829 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/320d5766-4cb7-4818-9072-86bfe7e7279d-config-volume\") pod 
\"collect-profiles-29547615-6d5r5\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.647975 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-csi-data-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.648032 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8541392-5b56-4a5d-ae7b-fd68ffdc2a85-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-b5kjm\" (UID: \"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.648414 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9053087d-d7b8-4835-a6f0-2e0bd3d16388-config-volume\") pod \"dns-default-8ls5c\" (UID: \"9053087d-d7b8-4835-a6f0-2e0bd3d16388\") " pod="openshift-dns/dns-default-8ls5c" Mar 07 04:22:59 crc kubenswrapper[4689]: E0307 04:22:59.648707 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:00.148689314 +0000 UTC m=+225.195073003 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.649537 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcaa0a81-da24-4346-b670-7ad3a516d8f6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7vcmc\" (UID: \"fcaa0a81-da24-4346-b670-7ad3a516d8f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.651444 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79a74909-eddf-4d5f-b43e-d6a790ff4d52-profile-collector-cert\") pod \"catalog-operator-68c6474976-9hbtn\" (UID: \"79a74909-eddf-4d5f-b43e-d6a790ff4d52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.651834 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/468005f5-e421-4e6e-950e-c5232f78adc8-config\") pod \"kube-controller-manager-operator-78b949d7b-4z4gl\" (UID: \"468005f5-e421-4e6e-950e-c5232f78adc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.652509 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-metrics-certs\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.652841 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/337ffd12-61ea-489d-94a6-4424e7eae3af-tmpfs\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.653035 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/320d5766-4cb7-4818-9072-86bfe7e7279d-secret-volume\") pod \"collect-profiles-29547615-6d5r5\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.653220 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34925a57-7fd9-4a0e-955c-cbc1ad264fed-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.653530 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-registration-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.653936 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/978fa00f-eb81-4333-8589-a484358f3f09-signing-cabundle\") pod \"service-ca-9c57cc56f-zdmpn\" (UID: \"978fa00f-eb81-4333-8589-a484358f3f09\") " pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.653984 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/253d51b5-b44c-42ea-b259-aa9ff80888d6-serving-cert\") pod \"service-ca-operator-777779d784-djkqv\" (UID: \"253d51b5-b44c-42ea-b259-aa9ff80888d6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.654190 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-socket-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.655065 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m4p5r\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.655359 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-mountpoint-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.655424 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/337ffd12-61ea-489d-94a6-4424e7eae3af-apiservice-cert\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.655930 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/253d51b5-b44c-42ea-b259-aa9ff80888d6-config\") pod \"service-ca-operator-777779d784-djkqv\" (UID: \"253d51b5-b44c-42ea-b259-aa9ff80888d6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.656386 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/838dc182-e289-4769-98b0-e76ad62793c1-plugins-dir\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.656691 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34925a57-7fd9-4a0e-955c-cbc1ad264fed-proxy-tls\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.657203 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-trusted-ca\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.657456 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/076fb655-c00f-4613-9c9a-5635aa6d3ddf-node-bootstrap-token\") pod \"machine-config-server-q6whv\" (UID: \"076fb655-c00f-4613-9c9a-5635aa6d3ddf\") " pod="openshift-machine-config-operator/machine-config-server-q6whv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.657495 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/340f2b24-7f0e-4198-bb0c-6c4f50e4fac9-cert\") pod \"ingress-canary-xwr6f\" (UID: \"340f2b24-7f0e-4198-bb0c-6c4f50e4fac9\") " pod="openshift-ingress-canary/ingress-canary-xwr6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.657788 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e-srv-cert\") pod \"olm-operator-6b444d44fb-lvzm2\" (UID: \"1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.658442 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m4p5r\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.658490 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-stats-auth\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.658499 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-metrics-tls\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.659324 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/337ffd12-61ea-489d-94a6-4424e7eae3af-webhook-cert\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.659435 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/468005f5-e421-4e6e-950e-c5232f78adc8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4z4gl\" (UID: \"468005f5-e421-4e6e-950e-c5232f78adc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.660671 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/076fb655-c00f-4613-9c9a-5635aa6d3ddf-certs\") pod \"machine-config-server-q6whv\" (UID: \"076fb655-c00f-4613-9c9a-5635aa6d3ddf\") " pod="openshift-machine-config-operator/machine-config-server-q6whv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.660814 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/978fa00f-eb81-4333-8589-a484358f3f09-signing-key\") pod \"service-ca-9c57cc56f-zdmpn\" (UID: \"978fa00f-eb81-4333-8589-a484358f3f09\") " pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.660891 4689 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a55cc042-6fa1-45a3-be75-9eb886b29a5a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s654w\" (UID: \"a55cc042-6fa1-45a3-be75-9eb886b29a5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.660922 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lvzm2\" (UID: \"1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.661096 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9053087d-d7b8-4835-a6f0-2e0bd3d16388-metrics-tls\") pod \"dns-default-8ls5c\" (UID: \"9053087d-d7b8-4835-a6f0-2e0bd3d16388\") " pod="openshift-dns/dns-default-8ls5c" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.661549 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8541392-5b56-4a5d-ae7b-fd68ffdc2a85-proxy-tls\") pod \"machine-config-controller-84d6567774-b5kjm\" (UID: \"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.662775 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79a74909-eddf-4d5f-b43e-d6a790ff4d52-srv-cert\") pod \"catalog-operator-68c6474976-9hbtn\" (UID: \"79a74909-eddf-4d5f-b43e-d6a790ff4d52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 
04:22:59.664473 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/030a2c5c-27d3-4eb6-889c-1888b80e9eef-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tbpn9\" (UID: \"030a2c5c-27d3-4eb6-889c-1888b80e9eef\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.671644 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-default-certificate\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.671656 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee57bd24-197d-4722-9a1a-a73e914a0973-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wblnn\" (UID: \"ee57bd24-197d-4722-9a1a-a73e914a0973\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.675767 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmvkg\" (UniqueName: \"kubernetes.io/projected/fe2e7665-098b-4338-9ff3-f936514ebbb9-kube-api-access-cmvkg\") pod \"apiserver-7bbb656c7d-4k6lm\" (UID: \"fe2e7665-098b-4338-9ff3-f936514ebbb9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.677217 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.677726 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w2lb\" (UniqueName: \"kubernetes.io/projected/423b5174-7bed-4fba-af44-51abd9188676-kube-api-access-8w2lb\") pod \"downloads-7954f5f757-nnnmk\" (UID: \"423b5174-7bed-4fba-af44-51abd9188676\") " pod="openshift-console/downloads-7954f5f757-nnnmk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.704915 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krwtj\" (UniqueName: \"kubernetes.io/projected/b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2-kube-api-access-krwtj\") pod \"openshift-controller-manager-operator-756b6f6bc6-qng8x\" (UID: \"b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.737013 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e4bbf5e-dcd1-4e37-ab88-1ce0def71019-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n8r6f\" (UID: \"1e4bbf5e-dcd1-4e37-ab88-1ce0def71019\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.743227 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.750627 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:22:59 crc kubenswrapper[4689]: E0307 04:22:59.750865 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:00.250818292 +0000 UTC m=+225.297201781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.751763 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: E0307 04:22:59.752130 4689 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:00.252110296 +0000 UTC m=+225.298493785 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.762840 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-bound-sa-token\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.776336 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z68gz\" (UniqueName: \"kubernetes.io/projected/3ec0b40d-04d4-486b-93bc-361c72d74aad-kube-api-access-z68gz\") pod \"machine-api-operator-5694c8668f-8ggcp\" (UID: \"3ec0b40d-04d4-486b-93bc-361c72d74aad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.796986 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7qls\" (UniqueName: \"kubernetes.io/projected/14c0f499-79e0-4090-bfaa-3d8606e04925-kube-api-access-m7qls\") pod \"authentication-operator-69f744f599-zw6mx\" (UID: \"14c0f499-79e0-4090-bfaa-3d8606e04925\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc 
kubenswrapper[4689]: I0307 04:22:59.823572 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb494b13-9120-4ff9-8349-48568da9e990-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g5279\" (UID: \"bb494b13-9120-4ff9-8349-48568da9e990\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.843788 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/468005f5-e421-4e6e-950e-c5232f78adc8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4z4gl\" (UID: \"468005f5-e421-4e6e-950e-c5232f78adc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" Mar 07 04:22:59 crc kubenswrapper[4689]: E0307 04:22:59.853678 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:00.353649457 +0000 UTC m=+225.400032946 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.853530 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.854501 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: E0307 04:22:59.854997 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:00.354974492 +0000 UTC m=+225.401357981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.858946 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m495j\" (UniqueName: \"kubernetes.io/projected/836ad923-c529-404d-82cb-6771c4932549-kube-api-access-m495j\") pod \"migrator-59844c95c7-4rlvc\" (UID: \"836ad923-c529-404d-82cb-6771c4932549\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.860007 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h6hq2"] Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.887628 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.891274 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npnd9\" (UniqueName: \"kubernetes.io/projected/76eea8f9-8567-496d-ac53-575a25a140de-kube-api-access-npnd9\") pod \"marketplace-operator-79b997595-m4p5r\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.891608 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.905198 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pstgz\" (UniqueName: \"kubernetes.io/projected/076fb655-c00f-4613-9c9a-5635aa6d3ddf-kube-api-access-pstgz\") pod \"machine-config-server-q6whv\" (UID: \"076fb655-c00f-4613-9c9a-5635aa6d3ddf\") " pod="openshift-machine-config-operator/machine-config-server-q6whv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.906430 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7"] Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.907734 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-prpp8"] Mar 07 04:22:59 crc kubenswrapper[4689]: W0307 04:22:59.907988 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f06b111_b994_4bb2_b1f3_1033b5cde4aa.slice/crio-3a9856edcb4fb03be493a91edab45a5577bd5f84581b4cf3ecd7239beba0692c WatchSource:0}: Error finding container 3a9856edcb4fb03be493a91edab45a5577bd5f84581b4cf3ecd7239beba0692c: Status 404 returned error can't find the container with id 3a9856edcb4fb03be493a91edab45a5577bd5f84581b4cf3ecd7239beba0692c Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.922267 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgn2t\" (UniqueName: \"kubernetes.io/projected/3f4cf0c7-db05-4fc8-b538-199d3d4a4824-kube-api-access-rgn2t\") pod \"router-default-5444994796-7dvxk\" (UID: \"3f4cf0c7-db05-4fc8-b538-199d3d4a4824\") " pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.945480 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cscmx\" 
(UniqueName: \"kubernetes.io/projected/c8541392-5b56-4a5d-ae7b-fd68ffdc2a85-kube-api-access-cscmx\") pod \"machine-config-controller-84d6567774-b5kjm\" (UID: \"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.949130 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6xlwc"] Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.949290 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nnnmk" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.954504 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-q6whv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.955700 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:22:59 crc kubenswrapper[4689]: E0307 04:22:59.955954 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:00.455906758 +0000 UTC m=+225.502290247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.959323 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56"] Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.959376 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvnk5\" (UniqueName: \"kubernetes.io/projected/320d5766-4cb7-4818-9072-86bfe7e7279d-kube-api-access-pvnk5\") pod \"collect-profiles-29547615-6d5r5\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.960006 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-j4z8p"] Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.965370 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:22:59 crc kubenswrapper[4689]: E0307 04:22:59.965804 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-07 04:23:00.465790817 +0000 UTC m=+225.512174306 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.971577 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.974571 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w89ns"] Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.975662 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvg2b\" (UniqueName: \"kubernetes.io/projected/253d51b5-b44c-42ea-b259-aa9ff80888d6-kube-api-access-tvg2b\") pod \"service-ca-operator-777779d784-djkqv\" (UID: \"253d51b5-b44c-42ea-b259-aa9ff80888d6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.990739 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.996130 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" event={"ID":"7d79bc2b-a849-4d82-bc59-197431e014db","Type":"ContainerStarted","Data":"673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd"} Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.996222 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" event={"ID":"7d79bc2b-a849-4d82-bc59-197431e014db","Type":"ContainerStarted","Data":"cfad0d99aa095414e88a7d6d2dd312e60da41fdd0c9b677431e98d15621fcc7e"} Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.996370 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:22:59 crc kubenswrapper[4689]: I0307 04:22:59.998807 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxgsz\" (UniqueName: \"kubernetes.io/projected/9053087d-d7b8-4835-a6f0-2e0bd3d16388-kube-api-access-fxgsz\") pod \"dns-default-8ls5c\" (UID: \"9053087d-d7b8-4835-a6f0-2e0bd3d16388\") " pod="openshift-dns/dns-default-8ls5c" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.003613 4689 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9x8l6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.003677 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" podUID="7d79bc2b-a849-4d82-bc59-197431e014db" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.003939 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.009747 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" event={"ID":"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0","Type":"ContainerStarted","Data":"90eb9145535bfb9301c8e2b9ee5545ad63e826861e94547e0b02c1b5d95b2862"} Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.024290 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7m96\" (UniqueName: \"kubernetes.io/projected/978fa00f-eb81-4333-8589-a484358f3f09-kube-api-access-c7m96\") pod \"service-ca-9c57cc56f-zdmpn\" (UID: \"978fa00f-eb81-4333-8589-a484358f3f09\") " pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.024847 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" event={"ID":"8f06b111-b994-4bb2-b1f3-1033b5cde4aa","Type":"ContainerStarted","Data":"3a9856edcb4fb03be493a91edab45a5577bd5f84581b4cf3ecd7239beba0692c"} Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.033389 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.038155 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" event={"ID":"774b5998-29de-4546-937e-b5d2ee0b27d4","Type":"ContainerStarted","Data":"7eabd730383efd37de9d2afa6030133afa3dbabfe3d3935ab05f271ad5dd35ae"} Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.050648 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvtn8\" (UniqueName: \"kubernetes.io/projected/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-kube-api-access-hvtn8\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.064489 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.076787 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfvx6\" (UniqueName: \"kubernetes.io/projected/030a2c5c-27d3-4eb6-889c-1888b80e9eef-kube-api-access-wfvx6\") pod \"multus-admission-controller-857f4d67dd-tbpn9\" (UID: \"030a2c5c-27d3-4eb6-889c-1888b80e9eef\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.077815 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 
04:23:00.077960 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:00.577928137 +0000 UTC m=+225.624311636 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.078251 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.078950 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp9jx\" (UniqueName: \"kubernetes.io/projected/340f2b24-7f0e-4198-bb0c-6c4f50e4fac9-kube-api-access-wp9jx\") pod \"ingress-canary-xwr6f\" (UID: \"340f2b24-7f0e-4198-bb0c-6c4f50e4fac9\") " pod="openshift-ingress-canary/ingress-canary-xwr6f" Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 04:23:00.079347 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-07 04:23:00.579334664 +0000 UTC m=+225.625718153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.084405 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.096430 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.101077 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmltb\" (UniqueName: \"kubernetes.io/projected/79a74909-eddf-4d5f-b43e-d6a790ff4d52-kube-api-access-mmltb\") pod \"catalog-operator-68c6474976-9hbtn\" (UID: \"79a74909-eddf-4d5f-b43e-d6a790ff4d52\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.103343 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.111647 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.120404 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.126261 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brvc5\" (UniqueName: \"kubernetes.io/projected/34925a57-7fd9-4a0e-955c-cbc1ad264fed-kube-api-access-brvc5\") pod \"machine-config-operator-74547568cd-5drsv\" (UID: \"34925a57-7fd9-4a0e-955c-cbc1ad264fed\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.127640 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.139036 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2hdm\" (UniqueName: \"kubernetes.io/projected/1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e-kube-api-access-z2hdm\") pod \"olm-operator-6b444d44fb-lvzm2\" (UID: \"1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.147515 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm"] Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.151509 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" event={"ID":"e4c3b676-f7ae-4659-a3f6-73dcc319bed8","Type":"ContainerStarted","Data":"e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3"} Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.152635 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.159355 4689 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-m6qtw\" (UniqueName: \"kubernetes.io/projected/ee57bd24-197d-4722-9a1a-a73e914a0973-kube-api-access-m6qtw\") pod \"control-plane-machine-set-operator-78cbb6b69f-wblnn\" (UID: \"ee57bd24-197d-4722-9a1a-a73e914a0973\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.161023 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-prpp8" event={"ID":"6b4a1ec5-fba3-4058-930e-96b000e4b052","Type":"ContainerStarted","Data":"24443c38543d1b8bbb61fee214b436204d9d7fafafe46b4274200fc9688ef205"} Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.165560 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" event={"ID":"72404506-4e6f-4494-a61a-2ac56bd6b123","Type":"ContainerStarted","Data":"b2a5184fad2797a1277ace0733d4cfa6b2d59acdaea8fc94bfb4c44aee2a42f9"} Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.167441 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.170390 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" event={"ID":"afea7082-9f6d-4c1f-a9be-ad1444e1459e","Type":"ContainerStarted","Data":"6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d"} Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.170435 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" event={"ID":"afea7082-9f6d-4c1f-a9be-ad1444e1459e","Type":"ContainerStarted","Data":"98e3c81c69dc17318e621cbd401420fb2cfae1b14a8ba8fb0b629a453883580b"} Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.171095 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.175709 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5t54\" (UniqueName: \"kubernetes.io/projected/fcaa0a81-da24-4346-b670-7ad3a516d8f6-kube-api-access-s5t54\") pod \"cluster-samples-operator-665b6dd947-7vcmc\" (UID: \"fcaa0a81-da24-4346-b670-7ad3a516d8f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.176335 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.179474 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.185938 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.187361 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 04:23:00.187793 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:00.687770956 +0000 UTC m=+225.734154445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.207034 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dce28ba1-7f97-47ad-8ba4-0b6a396e3d54-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4b8ff\" (UID: \"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.223112 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xwr6f" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.226477 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpzxh\" (UniqueName: \"kubernetes.io/projected/a55cc042-6fa1-45a3-be75-9eb886b29a5a-kube-api-access-kpzxh\") pod \"package-server-manager-789f6589d5-s654w\" (UID: \"a55cc042-6fa1-45a3-be75-9eb886b29a5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.235514 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.245784 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8ls5c" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.255578 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq28m\" (UniqueName: \"kubernetes.io/projected/337ffd12-61ea-489d-94a6-4424e7eae3af-kube-api-access-kq28m\") pod \"packageserver-d55dfcdfc-sbfsm\" (UID: \"337ffd12-61ea-489d-94a6-4424e7eae3af\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.270745 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l"] Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.277497 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmm87\" (UniqueName: \"kubernetes.io/projected/838dc182-e289-4769-98b0-e76ad62793c1-kube-api-access-zmm87\") pod \"csi-hostpathplugin-c6r5s\" (UID: \"838dc182-e289-4769-98b0-e76ad62793c1\") " pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.296049 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 04:23:00.296540 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:00.796522537 +0000 UTC m=+225.842906036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.337791 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnhp7\" (UniqueName: \"kubernetes.io/projected/33a94bd2-f479-403b-9c36-a708410864aa-kube-api-access-fnhp7\") pod \"auto-csr-approver-29547622-4796h\" (UID: \"33a94bd2-f479-403b-9c36-a708410864aa\") " pod="openshift-infra/auto-csr-approver-29547622-4796h" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.353840 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.379686 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.380157 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.396969 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 04:23:00.397656 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:00.897613707 +0000 UTC m=+225.943997206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.407520 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.438259 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.454333 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.458114 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.502036 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 04:23:00.502526 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.002513877 +0000 UTC m=+226.048897366 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.512718 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547622-4796h" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.566534 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.602895 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 04:23:00.603030 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.103005282 +0000 UTC m=+226.149388771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.603213 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 04:23:00.603536 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.103526675 +0000 UTC m=+226.149910164 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.642433 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nnnmk"] Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.692357 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8ggcp"] Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.704411 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 04:23:00.704795 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.204778009 +0000 UTC m=+226.251161498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.806766 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 04:23:00.808602 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.30858535 +0000 UTC m=+226.354968839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.809030 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" podStartSLOduration=162.809008532 podStartE2EDuration="2m42.809008532s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:00.795330253 +0000 UTC m=+225.841713742" watchObservedRunningTime="2026-03-07 04:23:00.809008532 +0000 UTC m=+225.855392021" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.809683 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x"] Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.827377 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f"] Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.908671 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" podStartSLOduration=162.908647964 podStartE2EDuration="2m42.908647964s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:00.90849199 +0000 UTC m=+225.954875509" 
watchObservedRunningTime="2026-03-07 04:23:00.908647964 +0000 UTC m=+225.955031453" Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.910120 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 04:23:00.911536 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.411519599 +0000 UTC m=+226.457903088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:00 crc kubenswrapper[4689]: I0307 04:23:00.911616 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:00 crc kubenswrapper[4689]: E0307 04:23:00.912183 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.412135345 +0000 UTC m=+226.458518984 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.021108 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.022020 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.522003856 +0000 UTC m=+226.568387345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.123357 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.123741 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.623727762 +0000 UTC m=+226.670111251 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.166503 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zw6mx"] Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.224389 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.224976 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nnnmk" event={"ID":"423b5174-7bed-4fba-af44-51abd9188676","Type":"ContainerStarted","Data":"7a4b0f2927f2d4d54576d4a3e39794d99605c6ad3d1a3d16b2f2bbf3ca6f60b6"} Mar 07 04:23:01 crc kubenswrapper[4689]: W0307 04:23:01.226142 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e4bbf5e_dcd1_4e37_ab88_1ce0def71019.slice/crio-60d155028d64a1bbf97a0469dbbe0eb21f3d4ae01a9bb5efcd282e25e9f766ac WatchSource:0}: Error finding container 60d155028d64a1bbf97a0469dbbe0eb21f3d4ae01a9bb5efcd282e25e9f766ac: Status 404 returned error can't find the container with id 60d155028d64a1bbf97a0469dbbe0eb21f3d4ae01a9bb5efcd282e25e9f766ac Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.233134 4689 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.73310403 +0000 UTC m=+226.779487519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.238947 4689 generic.go:334] "Generic (PLEG): container finished" podID="774b5998-29de-4546-937e-b5d2ee0b27d4" containerID="eeb3c443e1c2341f3b540e5e1c649a445823d2f42a03936bedfe81df13cda8e4" exitCode=0 Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.239003 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" event={"ID":"774b5998-29de-4546-937e-b5d2ee0b27d4","Type":"ContainerDied","Data":"eeb3c443e1c2341f3b540e5e1c649a445823d2f42a03936bedfe81df13cda8e4"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.254405 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-q6whv" event={"ID":"076fb655-c00f-4613-9c9a-5635aa6d3ddf","Type":"ContainerStarted","Data":"1e230bfe50975760205c58f1b7eb017ea8becb223e6a3da75d04e8364ce0d1b6"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.278265 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" 
event={"ID":"fe2e7665-098b-4338-9ff3-f936514ebbb9","Type":"ContainerStarted","Data":"6e03c06daf4e56aed7671a6363de1a016da549d1950e8acd511e3b4da6958d22"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.325495 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-prpp8" event={"ID":"6b4a1ec5-fba3-4058-930e-96b000e4b052","Type":"ContainerStarted","Data":"930a29fad19ba883ad766b1012aec38211a2777840565618edac111fa13921ce"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.332042 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.336991 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" event={"ID":"37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b","Type":"ContainerStarted","Data":"c1d7308b5832021de7cb421849aa9ec477ea1156de3dda3fe8adfbe75ebabc30"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.339067 4689 patch_prober.go:28] interesting pod/console-operator-58897d9998-prpp8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.339201 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-prpp8" podUID="6b4a1ec5-fba3-4058-930e-96b000e4b052" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.357867 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j4z8p" 
event={"ID":"5d0f9cf7-c781-4964-a714-bcd780e88285","Type":"ContainerStarted","Data":"986681b40e8e612a945cf4e3e2ef4aa50f5f97970b957ce3030ffc62b8000aee"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.358116 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.358463 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.858446295 +0000 UTC m=+226.904829794 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.367631 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" event={"ID":"3ec0b40d-04d4-486b-93bc-361c72d74aad","Type":"ContainerStarted","Data":"11f76659580d95eacb06fe1194682007c52f9b92282abb35457322159a32afd2"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.378873 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hzrqj" 
podStartSLOduration=163.37883801 podStartE2EDuration="2m43.37883801s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:01.373407838 +0000 UTC m=+226.419791347" watchObservedRunningTime="2026-03-07 04:23:01.37883801 +0000 UTC m=+226.425221499" Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.384683 4689 generic.go:334] "Generic (PLEG): container finished" podID="8f06b111-b994-4bb2-b1f3-1033b5cde4aa" containerID="cfe228066afcb8ef3874848272d6363c6d578354fd4e3a8acf08d46be7ab0620" exitCode=0 Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.384866 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" event={"ID":"8f06b111-b994-4bb2-b1f3-1033b5cde4aa","Type":"ContainerDied","Data":"cfe228066afcb8ef3874848272d6363c6d578354fd4e3a8acf08d46be7ab0620"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.389645 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279"] Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.390537 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" event={"ID":"d065e112-052a-4e44-87f6-7713ebdfa2bd","Type":"ContainerStarted","Data":"cb3e451c8d374d3cdb777131e920357873a4580a102d5cf088b73af309ea3dea"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.390707 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" event={"ID":"d065e112-052a-4e44-87f6-7713ebdfa2bd","Type":"ContainerStarted","Data":"c865feb7866eb3dd7f511cd5d509ff6d70f7374f95d27361f104e48c337fc7e6"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.394437 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zdmpn"] Mar 
07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.409020 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" event={"ID":"479d9009-47b4-4a26-990e-30e757c7aa17","Type":"ContainerStarted","Data":"4658259acc04164f48c94ee473f41d2b0666843fe268897c9f480d087687ff42"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.409103 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" event={"ID":"479d9009-47b4-4a26-990e-30e757c7aa17","Type":"ContainerStarted","Data":"651edf08d2eba6b10488ecc48d9a134ecc16451bd86237a3245fc55f6b2e9c52"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.418002 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" event={"ID":"e244dd83-cd20-40f8-a639-3164577c7316","Type":"ContainerStarted","Data":"8fb25a02c2dcf59c0b6bcf9146d474fc09f2799072c8b2c96f589eba84e3df3e"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.421805 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7dvxk" event={"ID":"3f4cf0c7-db05-4fc8-b538-199d3d4a4824","Type":"ContainerStarted","Data":"f919e51533591109185af275f78723771973a69ecb288c69a7cb76787079b4ee"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.426478 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" event={"ID":"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0","Type":"ContainerStarted","Data":"9dde49578409bd390c8b730d8f241d7c7001cf00976eabce5be74a0e2b580873"} Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.468648 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.468742 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.968621144 +0000 UTC m=+227.015004633 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.469139 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.469599 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:01.969585668 +0000 UTC m=+227.015969157 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.502194 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" podStartSLOduration=162.502159093 podStartE2EDuration="2m42.502159093s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:01.500624053 +0000 UTC m=+226.547007542" watchObservedRunningTime="2026-03-07 04:23:01.502159093 +0000 UTC m=+226.548542582" Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.514155 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.580229 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.580434 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 04:23:02.080393764 +0000 UTC m=+227.126777253 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.582092 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.587245 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:02.087224782 +0000 UTC m=+227.133608271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.646587 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn"] Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.744831 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.745437 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:02.245418149 +0000 UTC m=+227.291801638 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.847606 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.847924 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:02.347912647 +0000 UTC m=+227.394296136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.948976 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.949331 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:02.449273373 +0000 UTC m=+227.495656862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.956284 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:01 crc kubenswrapper[4689]: E0307 04:23:01.956687 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:02.456668167 +0000 UTC m=+227.503051656 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:01 crc kubenswrapper[4689]: I0307 04:23:01.977706 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-prpp8" podStartSLOduration=163.977685468 podStartE2EDuration="2m43.977685468s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:01.976083336 +0000 UTC m=+227.022466825" watchObservedRunningTime="2026-03-07 04:23:01.977685468 +0000 UTC m=+227.024068957" Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.056990 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.057502 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:02.55746457 +0000 UTC m=+227.603848059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.058541 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.059047 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:02.55903147 +0000 UTC m=+227.605414959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.093831 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xpp56" podStartSLOduration=164.093811482 podStartE2EDuration="2m44.093811482s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:02.068582661 +0000 UTC m=+227.114966150" watchObservedRunningTime="2026-03-07 04:23:02.093811482 +0000 UTC m=+227.140194971" Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.160439 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.160916 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:02.660894051 +0000 UTC m=+227.707277540 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.262343 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.262889 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:02.762867804 +0000 UTC m=+227.809251293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.364066 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.365344 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:02.86532172 +0000 UTC m=+227.911705209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.437634 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j4z8p" event={"ID":"5d0f9cf7-c781-4964-a714-bcd780e88285","Type":"ContainerStarted","Data":"8522e670ec3250b424a714b9fe3d3652d13118d4957e2884384d0df3c91d1084"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.453472 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-q6whv" event={"ID":"076fb655-c00f-4613-9c9a-5635aa6d3ddf","Type":"ContainerStarted","Data":"22a16a80ce459c1600626b86fe7a6f2328141ef8276b87045f7eb8d77ef2362d"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.467229 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.467603 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:02.967584231 +0000 UTC m=+228.013967720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.475561 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" event={"ID":"37d1ca19-1dab-4bd1-9ccc-c7f373f7b59b","Type":"ContainerStarted","Data":"a24eaf5973b2ca852851765c0a925c7653343a04832bd0998ea1e9f1f7a5919a"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.477059 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-j4z8p" podStartSLOduration=164.477030069 podStartE2EDuration="2m44.477030069s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:02.460889795 +0000 UTC m=+227.507273284" watchObservedRunningTime="2026-03-07 04:23:02.477030069 +0000 UTC m=+227.523413558" Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.482149 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" event={"ID":"978fa00f-eb81-4333-8589-a484358f3f09","Type":"ContainerStarted","Data":"c31bb6cf7bce27ca0dfde3f2d29fe952d0144a41208bfb6a6a938baa4bd4cc4d"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.486734 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" 
event={"ID":"7e52c228-a5cf-4b90-a8bc-4926c2d58ec0","Type":"ContainerStarted","Data":"324f28a730233e8046c38ffb9a826bb08c4372078f2ced50749634ff51d6c4a4"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.490909 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-q6whv" podStartSLOduration=5.490892902 podStartE2EDuration="5.490892902s" podCreationTimestamp="2026-03-07 04:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:02.48740633 +0000 UTC m=+227.533789809" watchObservedRunningTime="2026-03-07 04:23:02.490892902 +0000 UTC m=+227.537276391" Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.500326 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" event={"ID":"b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2","Type":"ContainerStarted","Data":"208e6b1fd6f2e35a5ea66709d138a19d958b915f1d9f4a2af0ab25e308af0cc7"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.500409 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" event={"ID":"b7e4b7fc-e8fc-4dcc-9998-b322f2c06ce2","Type":"ContainerStarted","Data":"9ce3e2efb1b6ac3ed24492a9496a86e2edc0dee56d088e5ff28d3116b99b8860"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.505237 4689 generic.go:334] "Generic (PLEG): container finished" podID="fe2e7665-098b-4338-9ff3-f936514ebbb9" containerID="3b735ea0117bc40c2d8c46156409dcd83cf544f88c8d3a8a74160742da20a32c" exitCode=0 Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.505365 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" 
event={"ID":"fe2e7665-098b-4338-9ff3-f936514ebbb9","Type":"ContainerDied","Data":"3b735ea0117bc40c2d8c46156409dcd83cf544f88c8d3a8a74160742da20a32c"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.506990 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g9xzr" podStartSLOduration=164.506974793 podStartE2EDuration="2m44.506974793s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:02.505286659 +0000 UTC m=+227.551670148" watchObservedRunningTime="2026-03-07 04:23:02.506974793 +0000 UTC m=+227.553358272" Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.510527 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" event={"ID":"1e4bbf5e-dcd1-4e37-ab88-1ce0def71019","Type":"ContainerStarted","Data":"60d155028d64a1bbf97a0469dbbe0eb21f3d4ae01a9bb5efcd282e25e9f766ac"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.511717 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" event={"ID":"3ec0b40d-04d4-486b-93bc-361c72d74aad","Type":"ContainerStarted","Data":"a0eee158c6bf8c5004192e9e5be063d271fb97b925d47309951edfc2eaee0faf"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.517327 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" event={"ID":"79a74909-eddf-4d5f-b43e-d6a790ff4d52","Type":"ContainerStarted","Data":"0aa4be7a10470e42963fb447502f96a0095936d4f2147ff0e52ecdfbaf803848"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.520124 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" 
event={"ID":"14c0f499-79e0-4090-bfaa-3d8606e04925","Type":"ContainerStarted","Data":"3c242eab06b2c3d5e3061bdbb6013fcc91dc991a50ee48950ddd84712670dd9c"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.524629 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" event={"ID":"bb494b13-9120-4ff9-8349-48568da9e990","Type":"ContainerStarted","Data":"1ffa9819ee8804a66ca1733e2dc5ea917f4b9efde70c6302e1174b649e0f3506"} Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.525003 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qng8x" podStartSLOduration=164.524984186 podStartE2EDuration="2m44.524984186s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:02.524895083 +0000 UTC m=+227.571278572" watchObservedRunningTime="2026-03-07 04:23:02.524984186 +0000 UTC m=+227.571367675" Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.525413 4689 patch_prober.go:28] interesting pod/console-operator-58897d9998-prpp8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.525475 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-prpp8" podUID="6b4a1ec5-fba3-4058-930e-96b000e4b052" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.570213 4689 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.570525 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:03.070475209 +0000 UTC m=+228.116858698 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.571427 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.573946 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:03.073922118 +0000 UTC m=+228.120305607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.672978 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.674578 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:03.174560737 +0000 UTC m=+228.220944226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.771773 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc"] Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.778182 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.778493 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:03.278478711 +0000 UTC m=+228.324862200 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.812433 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv"] Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.827453 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tbpn9"] Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.867752 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm"] Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.879069 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.879226 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5"] Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.879384 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 04:23:03.379353155 +0000 UTC m=+228.425736644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.879586 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.880041 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:03.380034773 +0000 UTC m=+228.426418262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.881728 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-djkqv"] Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.890781 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl"] Mar 07 04:23:02 crc kubenswrapper[4689]: W0307 04:23:02.891077 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34925a57_7fd9_4a0e_955c_cbc1ad264fed.slice/crio-9f078a8bab75abb027bc1a12be13fb93ac26f298168013eddfaa45dd724f9604 WatchSource:0}: Error finding container 9f078a8bab75abb027bc1a12be13fb93ac26f298168013eddfaa45dd724f9604: Status 404 returned error can't find the container with id 9f078a8bab75abb027bc1a12be13fb93ac26f298168013eddfaa45dd724f9604 Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.895046 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m4p5r"] Mar 07 04:23:02 crc kubenswrapper[4689]: W0307 04:23:02.933832 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8541392_5b56_4a5d_ae7b_fd68ffdc2a85.slice/crio-8146c6461bd6cb326cf8401b7cb550e8d502ce1e99e56ff9751900b96f6a6444 WatchSource:0}: Error finding container 
8146c6461bd6cb326cf8401b7cb550e8d502ce1e99e56ff9751900b96f6a6444: Status 404 returned error can't find the container with id 8146c6461bd6cb326cf8401b7cb550e8d502ce1e99e56ff9751900b96f6a6444 Mar 07 04:23:02 crc kubenswrapper[4689]: W0307 04:23:02.948024 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod320d5766_4cb7_4818_9072_86bfe7e7279d.slice/crio-e742d373c5f321123f07a4e3157100232d8a66c585f0649156c790295c1e9343 WatchSource:0}: Error finding container e742d373c5f321123f07a4e3157100232d8a66c585f0649156c790295c1e9343: Status 404 returned error can't find the container with id e742d373c5f321123f07a4e3157100232d8a66c585f0649156c790295c1e9343 Mar 07 04:23:02 crc kubenswrapper[4689]: W0307 04:23:02.963831 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod253d51b5_b44c_42ea_b259_aa9ff80888d6.slice/crio-5160e9041883a783eda9e4d7052412db61ae4c0c835d7b891f842190ccb8a99e WatchSource:0}: Error finding container 5160e9041883a783eda9e4d7052412db61ae4c0c835d7b891f842190ccb8a99e: Status 404 returned error can't find the container with id 5160e9041883a783eda9e4d7052412db61ae4c0c835d7b891f842190ccb8a99e Mar 07 04:23:02 crc kubenswrapper[4689]: I0307 04:23:02.980908 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:02 crc kubenswrapper[4689]: E0307 04:23:02.981364 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 04:23:03.481347139 +0000 UTC m=+228.527730628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.008071 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff"] Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.033223 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8ls5c"] Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.038488 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn"] Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.040100 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xwr6f"] Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.054830 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w"] Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.057331 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2"] Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.059770 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c6r5s"] Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.078968 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29547622-4796h"] Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.083471 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc"] Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.084138 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:03 crc kubenswrapper[4689]: E0307 04:23:03.084599 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:03.584580926 +0000 UTC m=+228.630964415 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.089034 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm"] Mar 07 04:23:03 crc kubenswrapper[4689]: W0307 04:23:03.099483 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee57bd24_197d_4722_9a1a_a73e914a0973.slice/crio-c7b60cc47154bbac78c68d2933903bd82aa6200134e8db004216a0a5e2acab27 WatchSource:0}: Error finding container c7b60cc47154bbac78c68d2933903bd82aa6200134e8db004216a0a5e2acab27: Status 404 returned error can't find the container with id c7b60cc47154bbac78c68d2933903bd82aa6200134e8db004216a0a5e2acab27 Mar 07 04:23:03 crc kubenswrapper[4689]: W0307 04:23:03.103356 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9053087d_d7b8_4835_a6f0_2e0bd3d16388.slice/crio-400311621d6ad4e8e0332a633a31c9f9e62ad769fbcbe9ac1d64e924dd1d92b9 WatchSource:0}: Error finding container 400311621d6ad4e8e0332a633a31c9f9e62ad769fbcbe9ac1d64e924dd1d92b9: Status 404 returned error can't find the container with id 400311621d6ad4e8e0332a633a31c9f9e62ad769fbcbe9ac1d64e924dd1d92b9 Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.157543 4689 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.192281 4689 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:03 crc kubenswrapper[4689]: E0307 04:23:03.192769 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:03.692751871 +0000 UTC m=+228.739135360 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.293594 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:03 crc kubenswrapper[4689]: E0307 04:23:03.293977 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:03.793962464 +0000 UTC m=+228.840345953 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.394908 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:03 crc kubenswrapper[4689]: E0307 04:23:03.395777 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:03.895757173 +0000 UTC m=+228.942140662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.497656 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:03 crc kubenswrapper[4689]: E0307 04:23:03.498003 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:03.997990163 +0000 UTC m=+229.044373642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.536664 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" event={"ID":"d065e112-052a-4e44-87f6-7713ebdfa2bd","Type":"ContainerStarted","Data":"2449b593ab34350947135b47e1a04865734556515be5cb45f93a5f05e247249b"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.554422 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6xlwc" podStartSLOduration=165.554395701 podStartE2EDuration="2m45.554395701s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:03.552629795 +0000 UTC m=+228.599013284" watchObservedRunningTime="2026-03-07 04:23:03.554395701 +0000 UTC m=+228.600779190" Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.555460 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" event={"ID":"bb494b13-9120-4ff9-8349-48568da9e990","Type":"ContainerStarted","Data":"c92e9bebdb1f119a418e39a64c0004952bdf820604ce4dddeac1322747e0f32d"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.569043 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc" 
event={"ID":"836ad923-c529-404d-82cb-6771c4932549","Type":"ContainerStarted","Data":"989645a9d0c448c9aea03aa96781b8dcb8d55d8cb4dafafdb29f2c7338f1f6ee"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.569094 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc" event={"ID":"836ad923-c529-404d-82cb-6771c4932549","Type":"ContainerStarted","Data":"fdd09f89b026c8e156bb20286b7cb077720c5c38dfcb6c1b32d6038cbe0324e6"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.571118 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" event={"ID":"79a74909-eddf-4d5f-b43e-d6a790ff4d52","Type":"ContainerStarted","Data":"30dcd1b1e28a4c3325d66a358ef407af24f3cdfe56dc33349f72ceb70b2fae0a"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.574065 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.582972 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" event={"ID":"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54","Type":"ContainerStarted","Data":"58904be1f4b9ee0204ecfba24259a9e8d418ad0a70710e1a81a3f6c9b081073c"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.598413 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547622-4796h" event={"ID":"33a94bd2-f479-403b-9c36-a708410864aa","Type":"ContainerStarted","Data":"cf868dfbf890c019a784d80bdd222eded7347837386dbf6a8bb49a62941384ee"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.598936 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:03 crc kubenswrapper[4689]: E0307 04:23:03.599317 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:04.099301549 +0000 UTC m=+229.145685038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.601397 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" event={"ID":"1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e","Type":"ContainerStarted","Data":"0e99a8d3888c0ae22046b5f86627be63f27305f075e03416785544c6c727edd6"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.603655 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" event={"ID":"34925a57-7fd9-4a0e-955c-cbc1ad264fed","Type":"ContainerStarted","Data":"bed442e4942d351868bda5af034a84ecbdded29a8f95dd2e8d36cfe2853ecd80"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.603674 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" 
event={"ID":"34925a57-7fd9-4a0e-955c-cbc1ad264fed","Type":"ContainerStarted","Data":"9f078a8bab75abb027bc1a12be13fb93ac26f298168013eddfaa45dd724f9604"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.610377 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" event={"ID":"3ec0b40d-04d4-486b-93bc-361c72d74aad","Type":"ContainerStarted","Data":"da2e10a8c7dee27646bbd5517c705a9f836404a817e262fe233547b09b32ecd4"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.614902 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" event={"ID":"468005f5-e421-4e6e-950e-c5232f78adc8","Type":"ContainerStarted","Data":"416e93736c3f020900819bf054715d558a7c8f90d24cf53e065953dea7c0484b"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.641794 4689 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9hbtn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.641884 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" podUID="79a74909-eddf-4d5f-b43e-d6a790ff4d52" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.642635 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5279" podStartSLOduration=164.642615214 podStartE2EDuration="2m44.642615214s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:03.642010098 +0000 UTC m=+228.688393587" watchObservedRunningTime="2026-03-07 04:23:03.642615214 +0000 UTC m=+228.688998703" Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.672833 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" event={"ID":"76eea8f9-8567-496d-ac53-575a25a140de","Type":"ContainerStarted","Data":"30241699ad5af1473e6b470ac38732b5128c4f0f066391f74df95aba3eab4101"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.675057 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.699045 4689 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m4p5r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.699539 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" podUID="76eea8f9-8567-496d-ac53-575a25a140de" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.699375 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" event={"ID":"978fa00f-eb81-4333-8589-a484358f3f09","Type":"ContainerStarted","Data":"d45084551c2b57dff7402d9ee981d9269cfbf0ce740077f4b9224d2caccee8a9"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.700585 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.704432 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-8ggcp" podStartSLOduration=164.704418665 podStartE2EDuration="2m44.704418665s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:03.672538049 +0000 UTC m=+228.718921538" watchObservedRunningTime="2026-03-07 04:23:03.704418665 +0000 UTC m=+228.750802154" Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.704961 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" podStartSLOduration=164.704957068 podStartE2EDuration="2m44.704957068s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:03.699921956 +0000 UTC m=+228.746305445" watchObservedRunningTime="2026-03-07 04:23:03.704957068 +0000 UTC m=+228.751340557" Mar 07 04:23:03 crc kubenswrapper[4689]: E0307 04:23:03.712475 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:04.212457365 +0000 UTC m=+229.258840854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.726789 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" event={"ID":"838dc182-e289-4769-98b0-e76ad62793c1","Type":"ContainerStarted","Data":"17380ced48880f1afae1a3f553f3ad689d087875a188c65f7c2c411c65537f1e"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.765333 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-zdmpn" podStartSLOduration=164.765312181 podStartE2EDuration="2m44.765312181s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:03.74966124 +0000 UTC m=+228.796044729" watchObservedRunningTime="2026-03-07 04:23:03.765312181 +0000 UTC m=+228.811695670" Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.775373 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" event={"ID":"a55cc042-6fa1-45a3-be75-9eb886b29a5a","Type":"ContainerStarted","Data":"a34657acb09181cdc41001141226488264d62a25017c26a53d7583f02e119474"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.793130 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn" 
event={"ID":"ee57bd24-197d-4722-9a1a-a73e914a0973","Type":"ContainerStarted","Data":"c7b60cc47154bbac78c68d2933903bd82aa6200134e8db004216a0a5e2acab27"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.807748 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:03 crc kubenswrapper[4689]: E0307 04:23:03.811234 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:04.311213454 +0000 UTC m=+229.357596943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.851255 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" podStartSLOduration=164.851238703 podStartE2EDuration="2m44.851238703s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:03.810030803 +0000 UTC m=+228.856414292" watchObservedRunningTime="2026-03-07 04:23:03.851238703 +0000 UTC m=+228.897622182" Mar 07 
04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.860798 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" event={"ID":"8f06b111-b994-4bb2-b1f3-1033b5cde4aa","Type":"ContainerStarted","Data":"287e86b26bff5aa156e90e3e309bd90651edf28af45875306d12c3a4657d2e60"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.860842 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" event={"ID":"8f06b111-b994-4bb2-b1f3-1033b5cde4aa","Type":"ContainerStarted","Data":"67e532e559c1434e853f75aefd749a450ff6eb837800adffe4cb0fa4060193ef"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.875527 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" event={"ID":"320d5766-4cb7-4818-9072-86bfe7e7279d","Type":"ContainerStarted","Data":"34cf1dd6bba6fabfa972c6be4e0b3427b8e4b1f04fe98d75739ec72d98759d09"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.875572 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" event={"ID":"320d5766-4cb7-4818-9072-86bfe7e7279d","Type":"ContainerStarted","Data":"e742d373c5f321123f07a4e3157100232d8a66c585f0649156c790295c1e9343"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.900456 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" event={"ID":"14c0f499-79e0-4090-bfaa-3d8606e04925","Type":"ContainerStarted","Data":"51e540eedbace8e6036dd6e9e2ed481756ce1d9251d401e862aa5333dd9cfaaf"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.915829 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" podStartSLOduration=165.915809826 podStartE2EDuration="2m45.915809826s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:03.912864808 +0000 UTC m=+228.959248297" watchObservedRunningTime="2026-03-07 04:23:03.915809826 +0000 UTC m=+228.962193315" Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.916347 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn" podStartSLOduration=164.91633931 podStartE2EDuration="2m44.91633931s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:03.851658874 +0000 UTC m=+228.898042383" watchObservedRunningTime="2026-03-07 04:23:03.91633931 +0000 UTC m=+228.962722789" Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.931885 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:03 crc kubenswrapper[4689]: E0307 04:23:03.933508 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:04.433488829 +0000 UTC m=+229.479872318 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.935486 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8ls5c" event={"ID":"9053087d-d7b8-4835-a6f0-2e0bd3d16388","Type":"ContainerStarted","Data":"400311621d6ad4e8e0332a633a31c9f9e62ad769fbcbe9ac1d64e924dd1d92b9"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.946931 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" event={"ID":"774b5998-29de-4546-937e-b5d2ee0b27d4","Type":"ContainerStarted","Data":"6e2547690db43eea8c6c506e710b7060fcbfd1891ff3c8f30b8965a19428ce3e"} Mar 07 04:23:03 crc kubenswrapper[4689]: I0307 04:23:03.947124 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.027124 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-zw6mx" podStartSLOduration=166.027100133 podStartE2EDuration="2m46.027100133s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:04.020632573 +0000 UTC m=+229.067016062" watchObservedRunningTime="2026-03-07 04:23:04.027100133 +0000 UTC m=+229.073483622" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.028997 4689 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nnnmk" event={"ID":"423b5174-7bed-4fba-af44-51abd9188676","Type":"ContainerStarted","Data":"d84167c76cbc8710b8034538c3886fcb603029f4feef28624b39b186417a9c82"} Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.034246 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nnnmk" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.044490 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.044773 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:04.544745386 +0000 UTC m=+229.591128875 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.047618 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.048716 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:04.54869305 +0000 UTC m=+229.595076669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.053003 4689 patch_prober.go:28] interesting pod/downloads-7954f5f757-nnnmk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.053050 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" event={"ID":"fcaa0a81-da24-4346-b670-7ad3a516d8f6","Type":"ContainerStarted","Data":"3cca3d9abad2497bb7fd591a55b4265bca1d2dbe12f2ce23093c5f238ce2fb1f"} Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.053060 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nnnmk" podUID="423b5174-7bed-4fba-af44-51abd9188676" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.060259 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" event={"ID":"337ffd12-61ea-489d-94a6-4424e7eae3af","Type":"ContainerStarted","Data":"e6fe567cc567a4641b4dada567ac29ce99cf9d7f2131eed9b6351e341cd657a5"} Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.075288 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" event={"ID":"e244dd83-cd20-40f8-a639-3164577c7316","Type":"ContainerStarted","Data":"cd113479d0ad0f6d64cae2979d06a7cd29945e01f27c88a61931320d12fac5a5"} Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.081872 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" podStartSLOduration=166.081852728 podStartE2EDuration="2m46.081852728s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:04.080065472 +0000 UTC m=+229.126448961" watchObservedRunningTime="2026-03-07 04:23:04.081852728 +0000 UTC m=+229.128236217" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.102269 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" event={"ID":"030a2c5c-27d3-4eb6-889c-1888b80e9eef","Type":"ContainerStarted","Data":"9c0ccb4794e08d948230f0bb9ac80b19114dfa774eead6709f008084a6340fb0"} Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.120880 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xwr6f" event={"ID":"340f2b24-7f0e-4198-bb0c-6c4f50e4fac9","Type":"ContainerStarted","Data":"4610cf9f9e25929f6f9eca7806900cef0bc98d0684c8c342c6de43a41aae03e9"} Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.148557 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.148888 4689 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:04.648873506 +0000 UTC m=+229.695256995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.152932 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" event={"ID":"1e4bbf5e-dcd1-4e37-ab88-1ce0def71019","Type":"ContainerStarted","Data":"2b9ef4e05f9658e452919e3f453fc47c016f33ed2a522f8ebc1fcaf1502d49f5"} Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.157084 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-nnnmk" podStartSLOduration=166.15705939 podStartE2EDuration="2m46.15705939s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:04.144747007 +0000 UTC m=+229.191130496" watchObservedRunningTime="2026-03-07 04:23:04.15705939 +0000 UTC m=+229.203442879" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.174111 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7dvxk" event={"ID":"3f4cf0c7-db05-4fc8-b538-199d3d4a4824","Type":"ContainerStarted","Data":"d8f1a58d179f5ed91337d4db67bf3949013c993aa7ffebd557ab07ba05a88c58"} Mar 
07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.189029 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" event={"ID":"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85","Type":"ContainerStarted","Data":"b5980c1a25f4be92a6f76fe908d95490549bff75cc7270a1d60097d043ed9a0e"} Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.189422 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" event={"ID":"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85","Type":"ContainerStarted","Data":"8146c6461bd6cb326cf8401b7cb550e8d502ce1e99e56ff9751900b96f6a6444"} Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.230433 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" podStartSLOduration=166.230415323 podStartE2EDuration="2m46.230415323s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:04.175821722 +0000 UTC m=+229.222205211" watchObservedRunningTime="2026-03-07 04:23:04.230415323 +0000 UTC m=+229.276798812" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.232272 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" event={"ID":"253d51b5-b44c-42ea-b259-aa9ff80888d6","Type":"ContainerStarted","Data":"f8a0a0385e8432a767334712171b3c0e327f183458aabd547139b91940977d67"} Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.232313 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" event={"ID":"253d51b5-b44c-42ea-b259-aa9ff80888d6","Type":"ContainerStarted","Data":"5160e9041883a783eda9e4d7052412db61ae4c0c835d7b891f842190ccb8a99e"} Mar 07 
04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.233693 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-w89ns" podStartSLOduration=165.233681059 podStartE2EDuration="2m45.233681059s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:04.22992196 +0000 UTC m=+229.276305449" watchObservedRunningTime="2026-03-07 04:23:04.233681059 +0000 UTC m=+229.280064548" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.251729 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.252146 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:04.752128243 +0000 UTC m=+229.798511732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.280267 4689 ???:1] "http: TLS handshake error from 192.168.126.11:39348: no serving certificate available for the kubelet" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.281727 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-7dvxk" podStartSLOduration=165.281711038 podStartE2EDuration="2m45.281711038s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:04.280512896 +0000 UTC m=+229.326896385" watchObservedRunningTime="2026-03-07 04:23:04.281711038 +0000 UTC m=+229.328094527" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.314840 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n8r6f" podStartSLOduration=165.314820526 podStartE2EDuration="2m45.314820526s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:04.314679982 +0000 UTC m=+229.361063471" watchObservedRunningTime="2026-03-07 04:23:04.314820526 +0000 UTC m=+229.361204015" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.352475 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.352857 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:04.852838883 +0000 UTC m=+229.899222372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.353425 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.360982 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:04.860966776 +0000 UTC m=+229.907350255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.370962 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xwr6f" podStartSLOduration=8.370931017 podStartE2EDuration="8.370931017s" podCreationTimestamp="2026-03-07 04:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:04.362041503 +0000 UTC m=+229.408424992" watchObservedRunningTime="2026-03-07 04:23:04.370931017 +0000 UTC m=+229.417314506" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.386234 4689 ???:1] "http: TLS handshake error from 192.168.126.11:39356: no serving certificate available for the kubelet" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.393028 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-djkqv" podStartSLOduration=165.393000095 podStartE2EDuration="2m45.393000095s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:04.39200933 +0000 UTC m=+229.438392829" watchObservedRunningTime="2026-03-07 04:23:04.393000095 +0000 UTC m=+229.439383584" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.458634 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.459020 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:04.958998906 +0000 UTC m=+230.005382385 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.493445 4689 ???:1] "http: TLS handshake error from 192.168.126.11:39364: no serving certificate available for the kubelet" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.562948 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.563847 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-07 04:23:05.063834093 +0000 UTC m=+230.110217582 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.582464 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.582900 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.606043 4689 patch_prober.go:28] interesting pod/apiserver-76f77b778f-h6hq2 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.606125 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" podUID="8f06b111-b994-4bb2-b1f3-1033b5cde4aa" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.610456 4689 ???:1] "http: TLS handshake error from 192.168.126.11:39372: no serving certificate available for the kubelet" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.664871 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.665103 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.165064978 +0000 UTC m=+230.211448467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.665256 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.665616 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.165601111 +0000 UTC m=+230.211984590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.693777 4689 ???:1] "http: TLS handshake error from 192.168.126.11:39374: no serving certificate available for the kubelet" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.766797 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.766996 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.266964129 +0000 UTC m=+230.313347618 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.767434 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.767837 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.267828551 +0000 UTC m=+230.314212040 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.798025 4689 ???:1] "http: TLS handshake error from 192.168.126.11:39382: no serving certificate available for the kubelet" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.831570 4689 ???:1] "http: TLS handshake error from 192.168.126.11:39388: no serving certificate available for the kubelet" Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.868403 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.868668 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.368622434 +0000 UTC m=+230.415005923 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.869021 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.869462 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.369448946 +0000 UTC m=+230.415832485 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:04 crc kubenswrapper[4689]: I0307 04:23:04.970632 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:04 crc kubenswrapper[4689]: E0307 04:23:04.970980 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.470963326 +0000 UTC m=+230.517346815 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.011915 4689 ???:1] "http: TLS handshake error from 192.168.126.11:39390: no serving certificate available for the kubelet" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.072748 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:05 crc kubenswrapper[4689]: E0307 04:23:05.073258 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.573228747 +0000 UTC m=+230.619612236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.097304 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.105246 4689 patch_prober.go:28] interesting pod/router-default-5444994796-7dvxk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 04:23:05 crc kubenswrapper[4689]: [-]has-synced failed: reason withheld Mar 07 04:23:05 crc kubenswrapper[4689]: [+]process-running ok Mar 07 04:23:05 crc kubenswrapper[4689]: healthz check failed Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.105331 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7dvxk" podUID="3f4cf0c7-db05-4fc8-b538-199d3d4a4824" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.173610 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:05 crc kubenswrapper[4689]: E0307 04:23:05.173860 4689 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.673818794 +0000 UTC m=+230.720202283 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.174141 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:05 crc kubenswrapper[4689]: E0307 04:23:05.174610 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.674601895 +0000 UTC m=+230.720985384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.264710 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" event={"ID":"030a2c5c-27d3-4eb6-889c-1888b80e9eef","Type":"ContainerStarted","Data":"88b548c5b2c6f5618d99754408075c0226cb3210b102e1910db9b2251c518d41"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.265098 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" event={"ID":"030a2c5c-27d3-4eb6-889c-1888b80e9eef","Type":"ContainerStarted","Data":"56a15c92ed2c7e5295207c3cd885c25ffbb73f958847675da99fb9c48182ea6a"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.272475 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" event={"ID":"76eea8f9-8567-496d-ac53-575a25a140de","Type":"ContainerStarted","Data":"d04fe09a98945d979d230cec89cd839e6b709af2391b7e38db6d8a25f8aab189"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.274096 4689 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m4p5r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.274158 4689 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" podUID="76eea8f9-8567-496d-ac53-575a25a140de" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.277985 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:05 crc kubenswrapper[4689]: E0307 04:23:05.278423 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.778406056 +0000 UTC m=+230.824789535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.282134 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xwr6f" event={"ID":"340f2b24-7f0e-4198-bb0c-6c4f50e4fac9","Type":"ContainerStarted","Data":"78c9d457acd78eab73108164923001cd6610e1812292ab9b2c5705559bfbdefd"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.288605 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n9g9l" podStartSLOduration=166.288588583 podStartE2EDuration="2m46.288588583s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:04.47633427 +0000 UTC m=+229.522717759" watchObservedRunningTime="2026-03-07 04:23:05.288588583 +0000 UTC m=+230.334972072" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.309784 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" event={"ID":"fcaa0a81-da24-4346-b670-7ad3a516d8f6","Type":"ContainerStarted","Data":"818a7fef2a614e642198215c836c9d9a77d71a56917d6e9c1e0364966205f317"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.309839 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" 
event={"ID":"fcaa0a81-da24-4346-b670-7ad3a516d8f6","Type":"ContainerStarted","Data":"a86b988d6643761c73a3dc9851e00b68496b7fbe6c4c0d39ea66c638ec31adf6"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.320875 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" event={"ID":"468005f5-e421-4e6e-950e-c5232f78adc8","Type":"ContainerStarted","Data":"8b50afa05768440b941408c49177161869f6a0ff6ad434b28c23583e46e5c938"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.349275 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8ls5c" event={"ID":"9053087d-d7b8-4835-a6f0-2e0bd3d16388","Type":"ContainerStarted","Data":"36033ab440698484925a8a694cfbc48724c7215096f9f0103ee25653322bc2b5"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.349926 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-8ls5c" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.354461 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7vcmc" podStartSLOduration=167.354447579 podStartE2EDuration="2m47.354447579s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.352827727 +0000 UTC m=+230.399211216" watchObservedRunningTime="2026-03-07 04:23:05.354447579 +0000 UTC m=+230.400831068" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.354847 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-tbpn9" podStartSLOduration=166.354843699 podStartE2EDuration="2m46.354843699s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.288527291 +0000 UTC m=+230.334910780" watchObservedRunningTime="2026-03-07 04:23:05.354843699 +0000 UTC m=+230.401227188" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.368577 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfcf7" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.376835 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" event={"ID":"fe2e7665-098b-4338-9ff3-f936514ebbb9","Type":"ContainerStarted","Data":"86ef74da05a585d056179ab8dcd41180bebff234e504d40ec123cdea43547838"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.378708 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" event={"ID":"a55cc042-6fa1-45a3-be75-9eb886b29a5a","Type":"ContainerStarted","Data":"becc6283913fae74945ab0a23a882487f852795ace0bbe7e4eab932933d8ef54"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.378730 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" event={"ID":"a55cc042-6fa1-45a3-be75-9eb886b29a5a","Type":"ContainerStarted","Data":"a6ce79afacc61acb633c39a8a9b603b1532e534cc70cbd924e4dcf5148e8e108"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.379113 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.384062 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:05 crc kubenswrapper[4689]: E0307 04:23:05.384438 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.884423015 +0000 UTC m=+230.930806504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.389103 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" event={"ID":"1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e","Type":"ContainerStarted","Data":"33771df8a9f19054831c8aa0fdcddb96aa8611fb9202be65432c844a42da7540"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.390420 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.394785 4689 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lvzm2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.394832 4689 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" podUID="1d58d90d-d7d7-4d66-a8b5-9584a8b74a8e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.394774 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8ls5c" podStartSLOduration=8.394757706 podStartE2EDuration="8.394757706s" podCreationTimestamp="2026-03-07 04:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.39376549 +0000 UTC m=+230.440148979" watchObservedRunningTime="2026-03-07 04:23:05.394757706 +0000 UTC m=+230.441141185" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.399947 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" event={"ID":"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54","Type":"ContainerStarted","Data":"8c03c3237fba7ac7e767ced1c669a960a52f91cb7504458f3c49fb13a794b3be"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.400016 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" event={"ID":"dce28ba1-7f97-47ad-8ba4-0b6a396e3d54","Type":"ContainerStarted","Data":"6e783374095de7c52b6720858567da1a0c1250fe55e1c8e047921083e27e974c"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.411521 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wblnn" event={"ID":"ee57bd24-197d-4722-9a1a-a73e914a0973","Type":"ContainerStarted","Data":"1dc961be2e6220d92798a78212bced5ec03bee5ef05115a8355023c365737a29"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.428995 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" event={"ID":"337ffd12-61ea-489d-94a6-4424e7eae3af","Type":"ContainerStarted","Data":"126d235a4ff33d5798262ca6c1010e5da3c31a650e33b291e6b5251468588c5b"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.430021 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.434225 4689 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-sbfsm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.434269 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" podUID="337ffd12-61ea-489d-94a6-4424e7eae3af" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.444061 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4z4gl" podStartSLOduration=166.444040648 podStartE2EDuration="2m46.444040648s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.438921613 +0000 UTC m=+230.485305092" watchObservedRunningTime="2026-03-07 04:23:05.444040648 +0000 UTC m=+230.490424137" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.446101 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" event={"ID":"c8541392-5b56-4a5d-ae7b-fd68ffdc2a85","Type":"ContainerStarted","Data":"41551f608a03ffd1982420d436ee090e5f1e63e4a086abecb8816f284c5673f8"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.455366 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc" event={"ID":"836ad923-c529-404d-82cb-6771c4932549","Type":"ContainerStarted","Data":"1662cf5e100f195a4f0a8cbeca8191933e6b9f4c120d128a8c9330faf7ce7dc4"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.464852 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" podStartSLOduration=166.464834533 podStartE2EDuration="2m46.464834533s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.462604794 +0000 UTC m=+230.508988283" watchObservedRunningTime="2026-03-07 04:23:05.464834533 +0000 UTC m=+230.511218012" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.478694 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" event={"ID":"34925a57-7fd9-4a0e-955c-cbc1ad264fed","Type":"ContainerStarted","Data":"c7890e3793bf4aca56b93fe7f78ad19a14a584fa2e8c37680756abd52153ccb4"} Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.482515 4689 patch_prober.go:28] interesting pod/downloads-7954f5f757-nnnmk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.482563 4689 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-nnnmk" podUID="423b5174-7bed-4fba-af44-51abd9188676" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.484461 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:05 crc kubenswrapper[4689]: E0307 04:23:05.493006 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:05.992984791 +0000 UTC m=+231.039368280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.502492 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9hbtn" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.590273 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:05 crc kubenswrapper[4689]: E0307 04:23:05.591589 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.091576095 +0000 UTC m=+231.137959584 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.629016 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" podStartSLOduration=166.628990996 podStartE2EDuration="2m46.628990996s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.624433657 +0000 UTC m=+230.670817146" watchObservedRunningTime="2026-03-07 04:23:05.628990996 +0000 UTC m=+230.675374485" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.629133 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" podStartSLOduration=166.62912983 podStartE2EDuration="2m46.62912983s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.526578902 +0000 UTC m=+230.572962391" watchObservedRunningTime="2026-03-07 04:23:05.62912983 +0000 UTC m=+230.675513319" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.691634 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:05 crc kubenswrapper[4689]: E0307 04:23:05.691976 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.191957697 +0000 UTC m=+231.238341186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.752982 4689 ???:1] "http: TLS handshake error from 192.168.126.11:39394: no serving certificate available for the kubelet" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.769319 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" podStartSLOduration=166.769297344 podStartE2EDuration="2m46.769297344s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.764687224 +0000 UTC m=+230.811070713" watchObservedRunningTime="2026-03-07 04:23:05.769297344 +0000 UTC m=+230.815680833" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.769878 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b8ff" podStartSLOduration=166.76987149 podStartE2EDuration="2m46.76987149s" 
podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.729771038 +0000 UTC m=+230.776154527" watchObservedRunningTime="2026-03-07 04:23:05.76987149 +0000 UTC m=+230.816254979" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.796307 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:05 crc kubenswrapper[4689]: E0307 04:23:05.796705 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.296691233 +0000 UTC m=+231.343074722 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.895530 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5drsv" podStartSLOduration=166.895501193 podStartE2EDuration="2m46.895501193s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.827937571 +0000 UTC m=+230.874321060" watchObservedRunningTime="2026-03-07 04:23:05.895501193 +0000 UTC m=+230.941884682" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.897343 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:05 crc kubenswrapper[4689]: E0307 04:23:05.898113 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.398095321 +0000 UTC m=+231.444478810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.935162 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4rlvc" podStartSLOduration=166.935137712 podStartE2EDuration="2m46.935137712s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.881429864 +0000 UTC m=+230.927813343" watchObservedRunningTime="2026-03-07 04:23:05.935137712 +0000 UTC m=+230.981521201" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.966267 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-b5kjm" podStartSLOduration=166.966242517 podStartE2EDuration="2m46.966242517s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:05.916669588 +0000 UTC m=+230.963053077" watchObservedRunningTime="2026-03-07 04:23:05.966242517 +0000 UTC m=+231.012626006" Mar 07 04:23:05 crc kubenswrapper[4689]: I0307 04:23:05.999365 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" 
(UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:06 crc kubenswrapper[4689]: E0307 04:23:05.999727 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.499712095 +0000 UTC m=+231.546095584 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.082058 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fmghp"] Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.083013 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.088674 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.100449 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:06 crc kubenswrapper[4689]: E0307 04:23:06.100808 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.600791784 +0000 UTC m=+231.647175273 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.111595 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fmghp"] Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.113471 4689 patch_prober.go:28] interesting pod/router-default-5444994796-7dvxk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 04:23:06 crc kubenswrapper[4689]: [-]has-synced failed: reason withheld Mar 07 04:23:06 crc kubenswrapper[4689]: [+]process-running ok Mar 07 04:23:06 crc kubenswrapper[4689]: healthz check failed Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.113517 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7dvxk" podUID="3f4cf0c7-db05-4fc8-b538-199d3d4a4824" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.201721 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-utilities\") pod \"certified-operators-fmghp\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.201760 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-wrct5\" (UniqueName: \"kubernetes.io/projected/c82c3040-48ed-473b-9386-d58d13364f29-kube-api-access-wrct5\") pod \"certified-operators-fmghp\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.201820 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-catalog-content\") pod \"certified-operators-fmghp\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.201864 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:06 crc kubenswrapper[4689]: E0307 04:23:06.202132 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.702120691 +0000 UTC m=+231.748504180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.292754 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hvrwc"] Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.294094 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.303266 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9x8l6"] Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.303473 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" podUID="7d79bc2b-a849-4d82-bc59-197431e014db" containerName="controller-manager" containerID="cri-o://673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd" gracePeriod=30 Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.305043 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.305533 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 
04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.305742 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-catalog-content\") pod \"certified-operators-fmghp\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.305818 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-utilities\") pod \"certified-operators-fmghp\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.305838 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrct5\" (UniqueName: \"kubernetes.io/projected/c82c3040-48ed-473b-9386-d58d13364f29-kube-api-access-wrct5\") pod \"certified-operators-fmghp\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:23:06 crc kubenswrapper[4689]: E0307 04:23:06.306090 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.806077116 +0000 UTC m=+231.852460605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.306888 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-catalog-content\") pod \"certified-operators-fmghp\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.307097 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-utilities\") pod \"certified-operators-fmghp\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.343046 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrct5\" (UniqueName: \"kubernetes.io/projected/c82c3040-48ed-473b-9386-d58d13364f29-kube-api-access-wrct5\") pod \"certified-operators-fmghp\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.364122 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk"] Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.365103 4689 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" podUID="afea7082-9f6d-4c1f-a9be-ad1444e1459e" containerName="route-controller-manager" containerID="cri-o://6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d" gracePeriod=30 Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.378843 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvrwc"] Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.407406 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgwsh\" (UniqueName: \"kubernetes.io/projected/fd0c8e82-4247-4dbb-b1a5-4a258259199c-kube-api-access-bgwsh\") pod \"community-operators-hvrwc\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") " pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.407482 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-utilities\") pod \"community-operators-hvrwc\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") " pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.407558 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.407587 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-catalog-content\") pod \"community-operators-hvrwc\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") " pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:23:06 crc kubenswrapper[4689]: E0307 04:23:06.407925 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:06.907913045 +0000 UTC m=+231.954296534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.415572 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.511871 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.512413 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgwsh\" (UniqueName: \"kubernetes.io/projected/fd0c8e82-4247-4dbb-b1a5-4a258259199c-kube-api-access-bgwsh\") pod \"community-operators-hvrwc\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") " pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.512463 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-utilities\") pod \"community-operators-hvrwc\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") " pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.512537 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-catalog-content\") pod \"community-operators-hvrwc\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") " pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.512949 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-catalog-content\") pod \"community-operators-hvrwc\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") 
" pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:23:06 crc kubenswrapper[4689]: E0307 04:23:06.513024 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:07.013008831 +0000 UTC m=+232.059392320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.513491 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-utilities\") pod \"community-operators-hvrwc\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") " pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.527881 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-chw2s"] Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.538297 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8ls5c" event={"ID":"9053087d-d7b8-4835-a6f0-2e0bd3d16388","Type":"ContainerStarted","Data":"93f9e81a38b037480f4af2091dd28f0779c43919386fb0dd2de8af50b928cef4"} Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.538405 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.543451 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgwsh\" (UniqueName: \"kubernetes.io/projected/fd0c8e82-4247-4dbb-b1a5-4a258259199c-kube-api-access-bgwsh\") pod \"community-operators-hvrwc\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") " pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.556018 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" event={"ID":"838dc182-e289-4769-98b0-e76ad62793c1","Type":"ContainerStarted","Data":"b381ca0dc1bdd73d447aee224ba129d744548ba03a89c1e450dd5c2fbba34c60"} Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.573472 4689 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m4p5r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.573539 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" podUID="76eea8f9-8567-496d-ac53-575a25a140de" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.601661 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvzm2" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.624197 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-catalog-content\") pod \"certified-operators-chw2s\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.624346 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-utilities\") pod \"certified-operators-chw2s\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.624570 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfn55\" (UniqueName: \"kubernetes.io/projected/99bbfad4-6baf-4ada-88b8-158f49957da5-kube-api-access-tfn55\") pod \"certified-operators-chw2s\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.624761 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:06 crc kubenswrapper[4689]: E0307 04:23:06.632606 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:07.132590196 +0000 UTC m=+232.178973685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.635017 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.644482 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-chw2s"] Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.696799 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gc2hb"] Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.698110 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.709381 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gc2hb"] Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.727494 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.727801 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxh66\" (UniqueName: \"kubernetes.io/projected/d4a365d2-d74f-4675-b789-27bafa93fbff-kube-api-access-kxh66\") pod \"community-operators-gc2hb\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.727835 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-utilities\") pod \"community-operators-gc2hb\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.727858 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-catalog-content\") pod \"community-operators-gc2hb\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.727879 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-catalog-content\") pod \"certified-operators-chw2s\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.727915 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-utilities\") pod \"certified-operators-chw2s\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.727966 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfn55\" (UniqueName: \"kubernetes.io/projected/99bbfad4-6baf-4ada-88b8-158f49957da5-kube-api-access-tfn55\") pod \"certified-operators-chw2s\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:23:06 crc kubenswrapper[4689]: E0307 04:23:06.728210 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:07.228162991 +0000 UTC m=+232.274546520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.728639 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-catalog-content\") pod \"certified-operators-chw2s\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.729093 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-utilities\") pod \"certified-operators-chw2s\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.803735 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfn55\" (UniqueName: \"kubernetes.io/projected/99bbfad4-6baf-4ada-88b8-158f49957da5-kube-api-access-tfn55\") pod \"certified-operators-chw2s\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.830994 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.831045 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.831091 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.831137 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.831162 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxh66\" (UniqueName: \"kubernetes.io/projected/d4a365d2-d74f-4675-b789-27bafa93fbff-kube-api-access-kxh66\") pod \"community-operators-gc2hb\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.831200 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-utilities\") pod \"community-operators-gc2hb\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.831221 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-catalog-content\") pod \"community-operators-gc2hb\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.831264 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.831829 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-utilities\") pod \"community-operators-gc2hb\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:23:06 crc kubenswrapper[4689]: E0307 04:23:06.831928 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:07.3319063 +0000 UTC m=+232.378289789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.834381 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-catalog-content\") pod \"community-operators-gc2hb\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.838265 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.838716 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.845989 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.861743 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.884621 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxh66\" (UniqueName: \"kubernetes.io/projected/d4a365d2-d74f-4675-b789-27bafa93fbff-kube-api-access-kxh66\") pod \"community-operators-gc2hb\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.921107 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sbfsm" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.921615 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.936628 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:06 crc kubenswrapper[4689]: E0307 04:23:06.936768 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:07.436740439 +0000 UTC m=+232.483123928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:06 crc kubenswrapper[4689]: I0307 04:23:06.936937 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:06 crc kubenswrapper[4689]: E0307 04:23:06.940334 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:07.440318153 +0000 UTC m=+232.486701642 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.043556 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:07 crc kubenswrapper[4689]: E0307 04:23:07.043878 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:07.543859237 +0000 UTC m=+232.590242726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.054493 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.075252 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.087208 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.108069 4689 patch_prober.go:28] interesting pod/router-default-5444994796-7dvxk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 04:23:07 crc kubenswrapper[4689]: [-]has-synced failed: reason withheld Mar 07 04:23:07 crc kubenswrapper[4689]: [+]process-running ok Mar 07 04:23:07 crc kubenswrapper[4689]: healthz check failed Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.108101 4689 ???:1] "http: TLS handshake error from 192.168.126.11:39408: no serving certificate available for the kubelet" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.108125 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7dvxk" podUID="3f4cf0c7-db05-4fc8-b538-199d3d4a4824" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.113438 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.145016 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:07 crc kubenswrapper[4689]: E0307 04:23:07.145441 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:07.645425079 +0000 UTC m=+232.691808568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.167574 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fmghp"] Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.178768 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:23:07 crc kubenswrapper[4689]: W0307 04:23:07.238528 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc82c3040_48ed_473b_9386_d58d13364f29.slice/crio-6b09126c0edaa1f93a11d78c4a374f15c6f1c1d644b4cf7d6c40b284ad01e9b0 WatchSource:0}: Error finding container 6b09126c0edaa1f93a11d78c4a374f15c6f1c1d644b4cf7d6c40b284ad01e9b0: Status 404 returned error can't find the container with id 6b09126c0edaa1f93a11d78c4a374f15c6f1c1d644b4cf7d6c40b284ad01e9b0 Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.241493 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.246000 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.246060 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-config\") pod \"7d79bc2b-a849-4d82-bc59-197431e014db\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.246099 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qq6k\" (UniqueName: \"kubernetes.io/projected/7d79bc2b-a849-4d82-bc59-197431e014db-kube-api-access-4qq6k\") pod \"7d79bc2b-a849-4d82-bc59-197431e014db\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " Mar 07 04:23:07 crc 
kubenswrapper[4689]: I0307 04:23:07.246116 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d79bc2b-a849-4d82-bc59-197431e014db-serving-cert\") pod \"7d79bc2b-a849-4d82-bc59-197431e014db\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.246134 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-client-ca\") pod \"7d79bc2b-a849-4d82-bc59-197431e014db\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.246184 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-proxy-ca-bundles\") pod \"7d79bc2b-a849-4d82-bc59-197431e014db\" (UID: \"7d79bc2b-a849-4d82-bc59-197431e014db\") " Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.247499 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7d79bc2b-a849-4d82-bc59-197431e014db" (UID: "7d79bc2b-a849-4d82-bc59-197431e014db"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:07 crc kubenswrapper[4689]: E0307 04:23:07.247564 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:07.747551486 +0000 UTC m=+232.793934975 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.249267 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-client-ca" (OuterVolumeSpecName: "client-ca") pod "7d79bc2b-a849-4d82-bc59-197431e014db" (UID: "7d79bc2b-a849-4d82-bc59-197431e014db"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.249507 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-config" (OuterVolumeSpecName: "config") pod "7d79bc2b-a849-4d82-bc59-197431e014db" (UID: "7d79bc2b-a849-4d82-bc59-197431e014db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.271574 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d79bc2b-a849-4d82-bc59-197431e014db-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7d79bc2b-a849-4d82-bc59-197431e014db" (UID: "7d79bc2b-a849-4d82-bc59-197431e014db"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.282009 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d79bc2b-a849-4d82-bc59-197431e014db-kube-api-access-4qq6k" (OuterVolumeSpecName: "kube-api-access-4qq6k") pod "7d79bc2b-a849-4d82-bc59-197431e014db" (UID: "7d79bc2b-a849-4d82-bc59-197431e014db"). InnerVolumeSpecName "kube-api-access-4qq6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.348271 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-client-ca\") pod \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.348348 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h52md\" (UniqueName: \"kubernetes.io/projected/afea7082-9f6d-4c1f-a9be-ad1444e1459e-kube-api-access-h52md\") pod \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.348375 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-config\") pod \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.348427 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afea7082-9f6d-4c1f-a9be-ad1444e1459e-serving-cert\") pod \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\" (UID: \"afea7082-9f6d-4c1f-a9be-ad1444e1459e\") " Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.348580 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.348693 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.348706 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qq6k\" (UniqueName: \"kubernetes.io/projected/7d79bc2b-a849-4d82-bc59-197431e014db-kube-api-access-4qq6k\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.348715 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d79bc2b-a849-4d82-bc59-197431e014db-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.348724 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.348732 4689 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d79bc2b-a849-4d82-bc59-197431e014db-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:07 crc kubenswrapper[4689]: E0307 04:23:07.349042 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-07 04:23:07.849029027 +0000 UTC m=+232.895412516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.351332 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-client-ca" (OuterVolumeSpecName: "client-ca") pod "afea7082-9f6d-4c1f-a9be-ad1444e1459e" (UID: "afea7082-9f6d-4c1f-a9be-ad1444e1459e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.351949 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-config" (OuterVolumeSpecName: "config") pod "afea7082-9f6d-4c1f-a9be-ad1444e1459e" (UID: "afea7082-9f6d-4c1f-a9be-ad1444e1459e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.370066 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afea7082-9f6d-4c1f-a9be-ad1444e1459e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "afea7082-9f6d-4c1f-a9be-ad1444e1459e" (UID: "afea7082-9f6d-4c1f-a9be-ad1444e1459e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.370585 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afea7082-9f6d-4c1f-a9be-ad1444e1459e-kube-api-access-h52md" (OuterVolumeSpecName: "kube-api-access-h52md") pod "afea7082-9f6d-4c1f-a9be-ad1444e1459e" (UID: "afea7082-9f6d-4c1f-a9be-ad1444e1459e"). InnerVolumeSpecName "kube-api-access-h52md". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.453204 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.453882 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afea7082-9f6d-4c1f-a9be-ad1444e1459e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.453893 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.453902 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h52md\" (UniqueName: \"kubernetes.io/projected/afea7082-9f6d-4c1f-a9be-ad1444e1459e-kube-api-access-h52md\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.453910 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afea7082-9f6d-4c1f-a9be-ad1444e1459e-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:07 crc 
kubenswrapper[4689]: E0307 04:23:07.453975 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:07.953959478 +0000 UTC m=+233.000342967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.455492 4689 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.555740 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:07 crc kubenswrapper[4689]: E0307 04:23:07.556199 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:08.056147447 +0000 UTC m=+233.102530936 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.572710 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvrwc"] Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.598125 4689 generic.go:334] "Generic (PLEG): container finished" podID="afea7082-9f6d-4c1f-a9be-ad1444e1459e" containerID="6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d" exitCode=0 Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.598437 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" event={"ID":"afea7082-9f6d-4c1f-a9be-ad1444e1459e","Type":"ContainerDied","Data":"6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d"} Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.598486 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" event={"ID":"afea7082-9f6d-4c1f-a9be-ad1444e1459e","Type":"ContainerDied","Data":"98e3c81c69dc17318e621cbd401420fb2cfae1b14a8ba8fb0b629a453883580b"} Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.598505 4689 scope.go:117] "RemoveContainer" containerID="6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.598669 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.611159 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" event={"ID":"838dc182-e289-4769-98b0-e76ad62793c1","Type":"ContainerStarted","Data":"32a484e054fd2a278fbff03114c0e5b5e7c600c516ee19cecae61ee9fcbc94fb"} Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.613825 4689 generic.go:334] "Generic (PLEG): container finished" podID="7d79bc2b-a849-4d82-bc59-197431e014db" containerID="673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd" exitCode=0 Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.613868 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" event={"ID":"7d79bc2b-a849-4d82-bc59-197431e014db","Type":"ContainerDied","Data":"673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd"} Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.613885 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" event={"ID":"7d79bc2b-a849-4d82-bc59-197431e014db","Type":"ContainerDied","Data":"cfad0d99aa095414e88a7d6d2dd312e60da41fdd0c9b677431e98d15621fcc7e"} Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.613940 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9x8l6" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.631147 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmghp" event={"ID":"c82c3040-48ed-473b-9386-d58d13364f29","Type":"ContainerStarted","Data":"f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5"} Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.631197 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmghp" event={"ID":"c82c3040-48ed-473b-9386-d58d13364f29","Type":"ContainerStarted","Data":"6b09126c0edaa1f93a11d78c4a374f15c6f1c1d644b4cf7d6c40b284ad01e9b0"} Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.656779 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:07 crc kubenswrapper[4689]: E0307 04:23:07.657077 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:08.157060622 +0000 UTC m=+233.203444111 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.708831 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk"] Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.711567 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mqfk"] Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.719987 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9x8l6"] Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.723276 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9x8l6"] Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.750360 4689 scope.go:117] "RemoveContainer" containerID="6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d" Mar 07 04:23:07 crc kubenswrapper[4689]: E0307 04:23:07.757383 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d\": container with ID starting with 6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d not found: ID does not exist" containerID="6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.757478 4689 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d"} err="failed to get container status \"6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d\": rpc error: code = NotFound desc = could not find container \"6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d\": container with ID starting with 6e04d0b460f962a7b34c1f38215ebaf8c1b97e6d841e3508a02422367adab63d not found: ID does not exist" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.757518 4689 scope.go:117] "RemoveContainer" containerID="673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.758164 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:07 crc kubenswrapper[4689]: E0307 04:23:07.761093 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:08.261066088 +0000 UTC m=+233.307449577 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.769138 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-chw2s"] Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.788102 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gc2hb"] Mar 07 04:23:07 crc kubenswrapper[4689]: W0307 04:23:07.799629 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4a365d2_d74f_4675_b789_27bafa93fbff.slice/crio-1e80c58b90f87062395443f36ee703eab04f03f064addb6586c8934f7e8c6957 WatchSource:0}: Error finding container 1e80c58b90f87062395443f36ee703eab04f03f064addb6586c8934f7e8c6957: Status 404 returned error can't find the container with id 1e80c58b90f87062395443f36ee703eab04f03f064addb6586c8934f7e8c6957 Mar 07 04:23:07 crc kubenswrapper[4689]: W0307 04:23:07.801547 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99bbfad4_6baf_4ada_88b8_158f49957da5.slice/crio-3538b87fae878bdd5727fcccb1b1f79c960f83b3fcd0fdf7acd97bbe8402b3ef WatchSource:0}: Error finding container 3538b87fae878bdd5727fcccb1b1f79c960f83b3fcd0fdf7acd97bbe8402b3ef: Status 404 returned error can't find the container with id 3538b87fae878bdd5727fcccb1b1f79c960f83b3fcd0fdf7acd97bbe8402b3ef Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.847657 4689 scope.go:117] "RemoveContainer" 
containerID="673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.848391 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d79bc2b-a849-4d82-bc59-197431e014db" path="/var/lib/kubelet/pods/7d79bc2b-a849-4d82-bc59-197431e014db/volumes" Mar 07 04:23:07 crc kubenswrapper[4689]: E0307 04:23:07.848434 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd\": container with ID starting with 673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd not found: ID does not exist" containerID="673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.848469 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd"} err="failed to get container status \"673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd\": rpc error: code = NotFound desc = could not find container \"673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd\": container with ID starting with 673d7a486ee327dec473934c4757a405b7ba805cb5ce6e264fd2f546057a37dd not found: ID does not exist" Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.849878 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afea7082-9f6d-4c1f-a9be-ad1444e1459e" path="/var/lib/kubelet/pods/afea7082-9f6d-4c1f-a9be-ad1444e1459e/volumes" Mar 07 04:23:07 crc kubenswrapper[4689]: W0307 04:23:07.855910 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-063cd9a82b1cdeb053077a35bbafd8d3075e3608178ad99e3472e2505d8d981f WatchSource:0}: Error finding container 
063cd9a82b1cdeb053077a35bbafd8d3075e3608178ad99e3472e2505d8d981f: Status 404 returned error can't find the container with id 063cd9a82b1cdeb053077a35bbafd8d3075e3608178ad99e3472e2505d8d981f Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.862405 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:07 crc kubenswrapper[4689]: E0307 04:23:07.863291 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:08.363249987 +0000 UTC m=+233.409633476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:07 crc kubenswrapper[4689]: I0307 04:23:07.965573 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:07 crc kubenswrapper[4689]: E0307 04:23:07.966055 4689 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 04:23:08.466032901 +0000 UTC m=+233.512416390 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4cbc9" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.022938 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z"] Mar 07 04:23:08 crc kubenswrapper[4689]: E0307 04:23:08.023552 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d79bc2b-a849-4d82-bc59-197431e014db" containerName="controller-manager" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.023635 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d79bc2b-a849-4d82-bc59-197431e014db" containerName="controller-manager" Mar 07 04:23:08 crc kubenswrapper[4689]: E0307 04:23:08.023694 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afea7082-9f6d-4c1f-a9be-ad1444e1459e" containerName="route-controller-manager" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.023746 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="afea7082-9f6d-4c1f-a9be-ad1444e1459e" containerName="route-controller-manager" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.023921 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d79bc2b-a849-4d82-bc59-197431e014db" containerName="controller-manager" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.024023 4689 
memory_manager.go:354] "RemoveStaleState removing state" podUID="afea7082-9f6d-4c1f-a9be-ad1444e1459e" containerName="route-controller-manager" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.024497 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.035232 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.035300 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.035615 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.041794 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.042132 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.043129 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.047349 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z"] Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.050941 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-656dcd75f-psjv4"] Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.051626 4689 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.061822 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.062039 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.062193 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.062432 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.065521 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.066194 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.066480 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wctdr\" (UniqueName: \"kubernetes.io/projected/34067b0e-80e0-4d04-813a-0123a7914777-kube-api-access-wctdr\") pod \"route-controller-manager-5d75df64b8-nlp7z\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.066520 4689 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-config\") pod \"route-controller-manager-5d75df64b8-nlp7z\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.066546 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34067b0e-80e0-4d04-813a-0123a7914777-serving-cert\") pod \"route-controller-manager-5d75df64b8-nlp7z\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.066577 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-client-ca\") pod \"route-controller-manager-5d75df64b8-nlp7z\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: E0307 04:23:08.066708 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 04:23:08.56669042 +0000 UTC m=+233.613073909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.068928 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.076421 4689 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-07T04:23:07.455515139Z","Handler":null,"Name":""} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.076742 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.112498 4689 patch_prober.go:28] interesting pod/router-default-5444994796-7dvxk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 04:23:08 crc kubenswrapper[4689]: [-]has-synced failed: reason withheld Mar 07 04:23:08 crc kubenswrapper[4689]: [+]process-running ok Mar 07 04:23:08 crc kubenswrapper[4689]: healthz check failed Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.114010 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7dvxk" podUID="3f4cf0c7-db05-4fc8-b538-199d3d4a4824" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 
04:23:08.137919 4689 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.137955 4689 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.139699 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-656dcd75f-psjv4"] Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.167573 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-client-ca\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.167734 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-proxy-ca-bundles\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.167898 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wctdr\" (UniqueName: \"kubernetes.io/projected/34067b0e-80e0-4d04-813a-0123a7914777-kube-api-access-wctdr\") pod \"route-controller-manager-5d75df64b8-nlp7z\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: 
I0307 04:23:08.167999 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcl58\" (UniqueName: \"kubernetes.io/projected/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-kube-api-access-pcl58\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.168119 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-config\") pod \"route-controller-manager-5d75df64b8-nlp7z\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.168242 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34067b0e-80e0-4d04-813a-0123a7914777-serving-cert\") pod \"route-controller-manager-5d75df64b8-nlp7z\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.168339 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-serving-cert\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.168415 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-client-ca\") pod \"route-controller-manager-5d75df64b8-nlp7z\" 
(UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.168498 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-config\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.168591 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.169216 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-config\") pod \"route-controller-manager-5d75df64b8-nlp7z\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.169840 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-client-ca\") pod \"route-controller-manager-5d75df64b8-nlp7z\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.170930 4689 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME 
capability not set. Skipping MountDevice... Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.170953 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.172910 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34067b0e-80e0-4d04-813a-0123a7914777-serving-cert\") pod \"route-controller-manager-5d75df64b8-nlp7z\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.187535 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wctdr\" (UniqueName: \"kubernetes.io/projected/34067b0e-80e0-4d04-813a-0123a7914777-kube-api-access-wctdr\") pod \"route-controller-manager-5d75df64b8-nlp7z\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.261673 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4cbc9\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:08 
crc kubenswrapper[4689]: I0307 04:23:08.269352 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.269650 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-client-ca\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.269673 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-proxy-ca-bundles\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.269725 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcl58\" (UniqueName: \"kubernetes.io/projected/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-kube-api-access-pcl58\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.269766 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-serving-cert\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " 
pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.269787 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-config\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.271371 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-client-ca\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.271683 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-proxy-ca-bundles\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.272675 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-config\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.273854 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod 
"8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.291955 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tgr9z"] Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.293963 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.294433 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-serving-cert\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.296568 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.296970 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcl58\" (UniqueName: \"kubernetes.io/projected/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-kube-api-access-pcl58\") pod \"controller-manager-656dcd75f-psjv4\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.307700 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgr9z"] Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.371350 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-utilities\") pod \"redhat-marketplace-tgr9z\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.371421 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-catalog-content\") pod \"redhat-marketplace-tgr9z\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.371456 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cptc\" (UniqueName: \"kubernetes.io/projected/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-kube-api-access-4cptc\") pod \"redhat-marketplace-tgr9z\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.414300 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.415197 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-656dcd75f-psjv4"] Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.420335 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.440757 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.487087 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-catalog-content\") pod \"redhat-marketplace-tgr9z\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.487193 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cptc\" (UniqueName: \"kubernetes.io/projected/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-kube-api-access-4cptc\") pod \"redhat-marketplace-tgr9z\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.487283 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-utilities\") pod \"redhat-marketplace-tgr9z\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.488769 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-catalog-content\") pod \"redhat-marketplace-tgr9z\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.488808 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-utilities\") pod \"redhat-marketplace-tgr9z\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " 
pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.525803 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cptc\" (UniqueName: \"kubernetes.io/projected/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-kube-api-access-4cptc\") pod \"redhat-marketplace-tgr9z\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.549270 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.684702 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-72r56"] Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.686352 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.687493 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-72r56"] Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.718162 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" event={"ID":"838dc182-e289-4769-98b0-e76ad62793c1","Type":"ContainerStarted","Data":"8a82e77d9feb9a70b0aa05522c67083a7045d942b3bcf2a861aabec47713bfb7"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.718234 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" event={"ID":"838dc182-e289-4769-98b0-e76ad62793c1","Type":"ContainerStarted","Data":"47d673bcf60f48e3463f36962fc3d14bedb9fd8118ff925f47e6a21401f1a190"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.752223 4689 generic.go:334] "Generic (PLEG): container finished" podID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" 
containerID="74a5bd36ad9cbee2b5db5dc80ee5f3d0aa4c1e2609587afa354012efcdd3e6f4" exitCode=0 Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.752246 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-c6r5s" podStartSLOduration=11.7521506 podStartE2EDuration="11.7521506s" podCreationTimestamp="2026-03-07 04:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:08.746660265 +0000 UTC m=+233.793043754" watchObservedRunningTime="2026-03-07 04:23:08.7521506 +0000 UTC m=+233.798534089" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.752352 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvrwc" event={"ID":"fd0c8e82-4247-4dbb-b1a5-4a258259199c","Type":"ContainerDied","Data":"74a5bd36ad9cbee2b5db5dc80ee5f3d0aa4c1e2609587afa354012efcdd3e6f4"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.752392 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvrwc" event={"ID":"fd0c8e82-4247-4dbb-b1a5-4a258259199c","Type":"ContainerStarted","Data":"cac016d6a5f03f781e64c5f0cdc3e8efc9459ca6c80cb65c99555e2aaafd121e"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.766434 4689 generic.go:334] "Generic (PLEG): container finished" podID="99bbfad4-6baf-4ada-88b8-158f49957da5" containerID="f6adef77b539dc706907b15d2309dc087fd0c7ce7b432e9f28b580d81736de27" exitCode=0 Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.767314 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chw2s" event={"ID":"99bbfad4-6baf-4ada-88b8-158f49957da5","Type":"ContainerDied","Data":"f6adef77b539dc706907b15d2309dc087fd0c7ce7b432e9f28b580d81736de27"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.767348 4689 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-chw2s" event={"ID":"99bbfad4-6baf-4ada-88b8-158f49957da5","Type":"ContainerStarted","Data":"3538b87fae878bdd5727fcccb1b1f79c960f83b3fcd0fdf7acd97bbe8402b3ef"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.778706 4689 generic.go:334] "Generic (PLEG): container finished" podID="320d5766-4cb7-4818-9072-86bfe7e7279d" containerID="34cf1dd6bba6fabfa972c6be4e0b3427b8e4b1f04fe98d75739ec72d98759d09" exitCode=0 Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.778811 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" event={"ID":"320d5766-4cb7-4818-9072-86bfe7e7279d","Type":"ContainerDied","Data":"34cf1dd6bba6fabfa972c6be4e0b3427b8e4b1f04fe98d75739ec72d98759d09"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.788920 4689 generic.go:334] "Generic (PLEG): container finished" podID="c82c3040-48ed-473b-9386-d58d13364f29" containerID="f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5" exitCode=0 Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.789031 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmghp" event={"ID":"c82c3040-48ed-473b-9386-d58d13364f29","Type":"ContainerDied","Data":"f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.799331 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp2xw\" (UniqueName: \"kubernetes.io/projected/18622abe-0dae-4a1b-83b8-8314bf342ccc-kube-api-access-xp2xw\") pod \"redhat-marketplace-72r56\" (UID: \"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.799367 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-catalog-content\") pod \"redhat-marketplace-72r56\" (UID: \"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.799407 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-utilities\") pod \"redhat-marketplace-72r56\" (UID: \"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.801094 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"30421324b9910f1209ce3b6d3baa83d37a47b3dd2f835595225b6bbf851f2176"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.801125 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"184110cdd4c7308de4543004c8e493465a57d2bf084deb3655ef123245905b26"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.801543 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.815507 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"97ef5ba956017d78fd15c237cc820a15267769c3326a12affd4c38bd094f8fe5"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.815567 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4fdcfda231ad05034e999d350f84b91145fb2d7d43060c292705ac18b1411362"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.825083 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"36a0ee6d807707998f750ab6a198d8db50704b797769c06e9e05d8c6efb0f1e7"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.825132 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"063cd9a82b1cdeb053077a35bbafd8d3075e3608178ad99e3472e2505d8d981f"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.829739 4689 generic.go:334] "Generic (PLEG): container finished" podID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerID="4f336cd13b96da657bb9c0b1073a8059dd29fba54ce5443167766adcbf8d3b49" exitCode=0 Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.830819 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gc2hb" event={"ID":"d4a365d2-d74f-4675-b789-27bafa93fbff","Type":"ContainerDied","Data":"4f336cd13b96da657bb9c0b1073a8059dd29fba54ce5443167766adcbf8d3b49"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.830837 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gc2hb" event={"ID":"d4a365d2-d74f-4675-b789-27bafa93fbff","Type":"ContainerStarted","Data":"1e80c58b90f87062395443f36ee703eab04f03f064addb6586c8934f7e8c6957"} Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.869603 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-656dcd75f-psjv4"] Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.904856 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-utilities\") pod \"redhat-marketplace-72r56\" (UID: \"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.905062 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp2xw\" (UniqueName: \"kubernetes.io/projected/18622abe-0dae-4a1b-83b8-8314bf342ccc-kube-api-access-xp2xw\") pod \"redhat-marketplace-72r56\" (UID: \"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.905083 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-catalog-content\") pod \"redhat-marketplace-72r56\" (UID: \"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.906043 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-utilities\") pod \"redhat-marketplace-72r56\" (UID: \"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.908410 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-catalog-content\") pod \"redhat-marketplace-72r56\" (UID: \"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " 
pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.925771 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp2xw\" (UniqueName: \"kubernetes.io/projected/18622abe-0dae-4a1b-83b8-8314bf342ccc-kube-api-access-xp2xw\") pod \"redhat-marketplace-72r56\" (UID: \"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.940065 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgr9z"] Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.975209 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4cbc9"] Mar 07 04:23:08 crc kubenswrapper[4689]: W0307 04:23:08.981448 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec8159c9_c2bd_4af5_8b6b_b855bbd968a5.slice/crio-df30dd16acf27cd274071fd763c43d107b958ec2a40e5b89d7efd4ed889dca7c WatchSource:0}: Error finding container df30dd16acf27cd274071fd763c43d107b958ec2a40e5b89d7efd4ed889dca7c: Status 404 returned error can't find the container with id df30dd16acf27cd274071fd763c43d107b958ec2a40e5b89d7efd4ed889dca7c Mar 07 04:23:08 crc kubenswrapper[4689]: I0307 04:23:08.992971 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z"] Mar 07 04:23:08 crc kubenswrapper[4689]: W0307 04:23:08.997770 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60af193a_2553_4f45_b190_c86e1e3594e1.slice/crio-2fae2282f70c1c463259dbfb54aa4e1f56dfacda6c7a9bd04e1536eb48388c16 WatchSource:0}: Error finding container 2fae2282f70c1c463259dbfb54aa4e1f56dfacda6c7a9bd04e1536eb48388c16: Status 404 returned error can't find 
the container with id 2fae2282f70c1c463259dbfb54aa4e1f56dfacda6c7a9bd04e1536eb48388c16 Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.015572 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:23:09 crc kubenswrapper[4689]: W0307 04:23:09.044217 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34067b0e_80e0_4d04_813a_0123a7914777.slice/crio-7b6a46435158e52176f81cc6e457b877550f657342e327d060b47a3fbf4e70c7 WatchSource:0}: Error finding container 7b6a46435158e52176f81cc6e457b877550f657342e327d060b47a3fbf4e70c7: Status 404 returned error can't find the container with id 7b6a46435158e52176f81cc6e457b877550f657342e327d060b47a3fbf4e70c7 Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.101888 4689 patch_prober.go:28] interesting pod/router-default-5444994796-7dvxk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 04:23:09 crc kubenswrapper[4689]: [-]has-synced failed: reason withheld Mar 07 04:23:09 crc kubenswrapper[4689]: [+]process-running ok Mar 07 04:23:09 crc kubenswrapper[4689]: healthz check failed Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.101947 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7dvxk" podUID="3f4cf0c7-db05-4fc8-b538-199d3d4a4824" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.198659 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.199347 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.208631 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.208638 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.220446 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.290976 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2wh2s"] Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.292578 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.295064 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.296929 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2wh2s"] Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.322048 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.322086 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.386501 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-prpp8" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.401589 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.403186 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.413289 4689 patch_prober.go:28] interesting pod/console-f9d7485db-j4z8p container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.413338 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-j4z8p" podUID="5d0f9cf7-c781-4964-a714-bcd780e88285" containerName="console" probeResult="failure" output="Get \"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: connect: connection refused" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.424327 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-utilities\") pod \"redhat-operators-2wh2s\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.424408 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-catalog-content\") pod \"redhat-operators-2wh2s\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.424429 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwdzg\" (UniqueName: \"kubernetes.io/projected/98a53e64-9323-454c-9de0-a8d348182a64-kube-api-access-kwdzg\") pod \"redhat-operators-2wh2s\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.424458 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.424482 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.424818 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.485312 4689 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.489220 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-72r56"] Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.529374 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-catalog-content\") pod \"redhat-operators-2wh2s\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.529434 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwdzg\" (UniqueName: \"kubernetes.io/projected/98a53e64-9323-454c-9de0-a8d348182a64-kube-api-access-kwdzg\") pod \"redhat-operators-2wh2s\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.529545 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-utilities\") pod \"redhat-operators-2wh2s\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.531351 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-catalog-content\") pod \"redhat-operators-2wh2s\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " 
pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.532996 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-utilities\") pod \"redhat-operators-2wh2s\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.574042 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwdzg\" (UniqueName: \"kubernetes.io/projected/98a53e64-9323-454c-9de0-a8d348182a64-kube-api-access-kwdzg\") pod \"redhat-operators-2wh2s\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.591522 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.595821 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.596894 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-h6hq2" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.632469 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.680699 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5fhcx"] Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.685629 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.728344 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fhcx"] Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.737913 4689 ???:1] "http: TLS handshake error from 192.168.126.11:39410: no serving certificate available for the kubelet" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.748639 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kfv4\" (UniqueName: \"kubernetes.io/projected/b84afefb-ca8f-4586-a7bc-6d733cb723b1-kube-api-access-2kfv4\") pod \"redhat-operators-5fhcx\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.748768 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-utilities\") pod \"redhat-operators-5fhcx\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.748908 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-catalog-content\") pod \"redhat-operators-5fhcx\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.857750 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.865154 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2kfv4\" (UniqueName: \"kubernetes.io/projected/b84afefb-ca8f-4586-a7bc-6d733cb723b1-kube-api-access-2kfv4\") pod \"redhat-operators-5fhcx\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.865260 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-utilities\") pod \"redhat-operators-5fhcx\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.865304 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-catalog-content\") pod \"redhat-operators-5fhcx\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.866270 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-utilities\") pod \"redhat-operators-5fhcx\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.868209 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-catalog-content\") pod \"redhat-operators-5fhcx\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.870968 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" event={"ID":"34067b0e-80e0-4d04-813a-0123a7914777","Type":"ContainerStarted","Data":"28e082c2442371d7cfe04bd33ee338fb71b5329793e7fb4635839bbbc1056c1f"} Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.871029 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" event={"ID":"34067b0e-80e0-4d04-813a-0123a7914777","Type":"ContainerStarted","Data":"7b6a46435158e52176f81cc6e457b877550f657342e327d060b47a3fbf4e70c7"} Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.871577 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.891849 4689 generic.go:334] "Generic (PLEG): container finished" podID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" containerID="91b958974b11a0c2309684b1fcb2ec6cca548d20a31245933502d28061c0d57f" exitCode=0 Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.891934 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgr9z" event={"ID":"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5","Type":"ContainerDied","Data":"91b958974b11a0c2309684b1fcb2ec6cca548d20a31245933502d28061c0d57f"} Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.891967 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgr9z" event={"ID":"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5","Type":"ContainerStarted","Data":"df30dd16acf27cd274071fd763c43d107b958ec2a40e5b89d7efd4ed889dca7c"} Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.897809 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.897848 4689 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.909443 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kfv4\" (UniqueName: \"kubernetes.io/projected/b84afefb-ca8f-4586-a7bc-6d733cb723b1-kube-api-access-2kfv4\") pod \"redhat-operators-5fhcx\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.910324 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" event={"ID":"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad","Type":"ContainerStarted","Data":"58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809"} Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.910368 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" event={"ID":"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad","Type":"ContainerStarted","Data":"1a28d68b6ed33e17c13e2f69c45d029656b7bfcf7574d4de0712bfd52f6f08c0"} Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.910368 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" podUID="1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad" containerName="controller-manager" containerID="cri-o://58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809" gracePeriod=30 Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.910641 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.913569 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" 
podStartSLOduration=3.9135520550000003 podStartE2EDuration="3.913552055s" podCreationTimestamp="2026-03-07 04:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:09.894575128 +0000 UTC m=+234.940958637" watchObservedRunningTime="2026-03-07 04:23:09.913552055 +0000 UTC m=+234.959935534" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.922283 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.931372 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.939243 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" event={"ID":"60af193a-2553-4f45-b190-c86e1e3594e1","Type":"ContainerStarted","Data":"ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09"} Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.939280 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" event={"ID":"60af193a-2553-4f45-b190-c86e1e3594e1","Type":"ContainerStarted","Data":"2fae2282f70c1c463259dbfb54aa4e1f56dfacda6c7a9bd04e1536eb48388c16"} Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.940036 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.944441 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72r56" event={"ID":"18622abe-0dae-4a1b-83b8-8314bf342ccc","Type":"ContainerStarted","Data":"bc50de807dd4ed87fc796639e8e02d6a9441f37dc221c658fdef9c162750ed3d"} Mar 07 04:23:09 
crc kubenswrapper[4689]: I0307 04:23:09.947344 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" podStartSLOduration=3.94731756 podStartE2EDuration="3.94731756s" podCreationTimestamp="2026-03-07 04:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:09.944148147 +0000 UTC m=+234.990531636" watchObservedRunningTime="2026-03-07 04:23:09.94731756 +0000 UTC m=+234.993701049" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.950192 4689 patch_prober.go:28] interesting pod/downloads-7954f5f757-nnnmk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.950232 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nnnmk" podUID="423b5174-7bed-4fba-af44-51abd9188676" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.950531 4689 patch_prober.go:28] interesting pod/downloads-7954f5f757-nnnmk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 07 04:23:09 crc kubenswrapper[4689]: I0307 04:23:09.950689 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nnnmk" podUID="423b5174-7bed-4fba-af44-51abd9188676" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 07 04:23:10 crc 
kubenswrapper[4689]: I0307 04:23:10.010345 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" podStartSLOduration=172.010326282 podStartE2EDuration="2m52.010326282s" podCreationTimestamp="2026-03-07 04:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:09.984975887 +0000 UTC m=+235.031359376" watchObservedRunningTime="2026-03-07 04:23:10.010326282 +0000 UTC m=+235.056709761" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.033708 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.054559 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.100587 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.105695 4689 patch_prober.go:28] interesting pod/router-default-5444994796-7dvxk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 04:23:10 crc kubenswrapper[4689]: [-]has-synced failed: reason withheld Mar 07 04:23:10 crc kubenswrapper[4689]: [+]process-running ok Mar 07 04:23:10 crc kubenswrapper[4689]: healthz check failed Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.105743 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7dvxk" podUID="3f4cf0c7-db05-4fc8-b538-199d3d4a4824" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" 
Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.120841 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.125084 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.142391 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.153703 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.153924 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.163667 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.192819 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.280161 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e5e7e7ff-bb38-4d50-8361-c84398f40fd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.280268 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e5e7e7ff-bb38-4d50-8361-c84398f40fd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.356778 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2wh2s"] Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.385358 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e5e7e7ff-bb38-4d50-8361-c84398f40fd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.385415 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e5e7e7ff-bb38-4d50-8361-c84398f40fd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.385868 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e5e7e7ff-bb38-4d50-8361-c84398f40fd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.421726 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e5e7e7ff-bb38-4d50-8361-c84398f40fd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 04:23:10 crc 
kubenswrapper[4689]: I0307 04:23:10.508201 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.618994 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fhcx"] Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.713078 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.817485 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvnk5\" (UniqueName: \"kubernetes.io/projected/320d5766-4cb7-4818-9072-86bfe7e7279d-kube-api-access-pvnk5\") pod \"320d5766-4cb7-4818-9072-86bfe7e7279d\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.817624 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/320d5766-4cb7-4818-9072-86bfe7e7279d-config-volume\") pod \"320d5766-4cb7-4818-9072-86bfe7e7279d\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.817677 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/320d5766-4cb7-4818-9072-86bfe7e7279d-secret-volume\") pod \"320d5766-4cb7-4818-9072-86bfe7e7279d\" (UID: \"320d5766-4cb7-4818-9072-86bfe7e7279d\") " Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.820774 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320d5766-4cb7-4818-9072-86bfe7e7279d-config-volume" (OuterVolumeSpecName: "config-volume") pod "320d5766-4cb7-4818-9072-86bfe7e7279d" (UID: "320d5766-4cb7-4818-9072-86bfe7e7279d"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.824761 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320d5766-4cb7-4818-9072-86bfe7e7279d-kube-api-access-pvnk5" (OuterVolumeSpecName: "kube-api-access-pvnk5") pod "320d5766-4cb7-4818-9072-86bfe7e7279d" (UID: "320d5766-4cb7-4818-9072-86bfe7e7279d"). InnerVolumeSpecName "kube-api-access-pvnk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.829581 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320d5766-4cb7-4818-9072-86bfe7e7279d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "320d5766-4cb7-4818-9072-86bfe7e7279d" (UID: "320d5766-4cb7-4818-9072-86bfe7e7279d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.852498 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.920150 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvnk5\" (UniqueName: \"kubernetes.io/projected/320d5766-4cb7-4818-9072-86bfe7e7279d-kube-api-access-pvnk5\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.920208 4689 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/320d5766-4cb7-4818-9072-86bfe7e7279d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.920228 4689 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/320d5766-4cb7-4818-9072-86bfe7e7279d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.951563 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.995496 4689 generic.go:334] "Generic (PLEG): container finished" podID="18622abe-0dae-4a1b-83b8-8314bf342ccc" containerID="ee408d1b8417cd40313871a28ad8ef262987426ebc665f9b7bc4ebbe63cdd2af" exitCode=0 Mar 07 04:23:10 crc kubenswrapper[4689]: I0307 04:23:10.995604 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72r56" event={"ID":"18622abe-0dae-4a1b-83b8-8314bf342ccc","Type":"ContainerDied","Data":"ee408d1b8417cd40313871a28ad8ef262987426ebc665f9b7bc4ebbe63cdd2af"} Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.002558 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fhcx" event={"ID":"b84afefb-ca8f-4586-a7bc-6d733cb723b1","Type":"ContainerStarted","Data":"2fbcb8a08e1d31015539a5f486d69ebfc2f2531ddbf2cabf4650a5d65534301b"} Mar 07 04:23:11 
crc kubenswrapper[4689]: W0307 04:23:11.007654 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode5e7e7ff_bb38_4d50_8361_c84398f40fd5.slice/crio-6cadd9228be21bfa71f1ea9f80eed31d830e75c6592df901d215bc770eafabf1 WatchSource:0}: Error finding container 6cadd9228be21bfa71f1ea9f80eed31d830e75c6592df901d215bc770eafabf1: Status 404 returned error can't find the container with id 6cadd9228be21bfa71f1ea9f80eed31d830e75c6592df901d215bc770eafabf1 Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.015059 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wh2s" event={"ID":"98a53e64-9323-454c-9de0-a8d348182a64","Type":"ContainerStarted","Data":"70200621582e8c6967da7ae5719b2d3dd116dc355d3f0d6614fc2d0bebc9fe7f"} Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.019262 4689 generic.go:334] "Generic (PLEG): container finished" podID="1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad" containerID="58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809" exitCode=0 Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.019325 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" event={"ID":"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad","Type":"ContainerDied","Data":"58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809"} Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.019349 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" event={"ID":"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad","Type":"ContainerDied","Data":"1a28d68b6ed33e17c13e2f69c45d029656b7bfcf7574d4de0712bfd52f6f08c0"} Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.019367 4689 scope.go:117] "RemoveContainer" containerID="58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.019475 4689 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-656dcd75f-psjv4" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.020797 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-serving-cert\") pod \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.020992 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcl58\" (UniqueName: \"kubernetes.io/projected/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-kube-api-access-pcl58\") pod \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.021034 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-proxy-ca-bundles\") pod \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.021075 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-config\") pod \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.021148 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-client-ca\") pod \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\" (UID: \"1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad\") " Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.022230 4689 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad" (UID: "1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.022274 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-config" (OuterVolumeSpecName: "config") pod "1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad" (UID: "1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.024198 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-client-ca" (OuterVolumeSpecName: "client-ca") pod "1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad" (UID: "1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.026350 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-kube-api-access-pcl58" (OuterVolumeSpecName: "kube-api-access-pcl58") pod "1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad" (UID: "1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad"). InnerVolumeSpecName "kube-api-access-pcl58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.028612 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad" (UID: "1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.045930 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.046002 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547615-6d5r5" event={"ID":"320d5766-4cb7-4818-9072-86bfe7e7279d","Type":"ContainerDied","Data":"e742d373c5f321123f07a4e3157100232d8a66c585f0649156c790295c1e9343"} Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.046081 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e742d373c5f321123f07a4e3157100232d8a66c585f0649156c790295c1e9343" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.054102 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91","Type":"ContainerStarted","Data":"0515049375f34325fe7a96c3f9b602d44ea30f43098df76d97ce9af081329498"} Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.063477 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4k6lm" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.110241 4689 patch_prober.go:28] interesting pod/router-default-5444994796-7dvxk container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 04:23:11 crc kubenswrapper[4689]: [-]has-synced failed: reason withheld Mar 07 04:23:11 crc kubenswrapper[4689]: [+]process-running ok Mar 07 04:23:11 crc kubenswrapper[4689]: healthz check failed Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.110336 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7dvxk" podUID="3f4cf0c7-db05-4fc8-b538-199d3d4a4824" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.119106 4689 scope.go:117] "RemoveContainer" containerID="58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809" Mar 07 04:23:11 crc kubenswrapper[4689]: E0307 04:23:11.119725 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809\": container with ID starting with 58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809 not found: ID does not exist" containerID="58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.119778 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809"} err="failed to get container status \"58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809\": rpc error: code = NotFound desc = could not find container \"58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809\": container with ID starting with 58af85e62f0005b153687a53a1c56fde52a384cddbcb334f894cccf589e92809 not found: ID does not exist" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.122824 4689 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pcl58\" (UniqueName: \"kubernetes.io/projected/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-kube-api-access-pcl58\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.122886 4689 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.122896 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.122923 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.122931 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.214631 4689 ???:1] "http: TLS handshake error from 192.168.126.11:51828: no serving certificate available for the kubelet" Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.378916 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-656dcd75f-psjv4"] Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.378982 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-656dcd75f-psjv4"] Mar 07 04:23:11 crc kubenswrapper[4689]: I0307 04:23:11.842969 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad" 
path="/var/lib/kubelet/pods/1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad/volumes" Mar 07 04:23:12 crc kubenswrapper[4689]: I0307 04:23:12.072136 4689 generic.go:334] "Generic (PLEG): container finished" podID="34e2a5a6-6e4a-4d6e-8759-73a1c8543d91" containerID="63a2c33e71bc0f483e3bc33c98bc39ca4bfd1ea9a3322fe8653afe65560b713c" exitCode=0 Mar 07 04:23:12 crc kubenswrapper[4689]: I0307 04:23:12.072223 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91","Type":"ContainerDied","Data":"63a2c33e71bc0f483e3bc33c98bc39ca4bfd1ea9a3322fe8653afe65560b713c"} Mar 07 04:23:12 crc kubenswrapper[4689]: I0307 04:23:12.082791 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e5e7e7ff-bb38-4d50-8361-c84398f40fd5","Type":"ContainerStarted","Data":"6cadd9228be21bfa71f1ea9f80eed31d830e75c6592df901d215bc770eafabf1"} Mar 07 04:23:12 crc kubenswrapper[4689]: I0307 04:23:12.095571 4689 generic.go:334] "Generic (PLEG): container finished" podID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerID="41c587ecc10eca9dd4cbe0eef1bb567d64fee74a46374564312a04497e0f19a7" exitCode=0 Mar 07 04:23:12 crc kubenswrapper[4689]: I0307 04:23:12.095652 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fhcx" event={"ID":"b84afefb-ca8f-4586-a7bc-6d733cb723b1","Type":"ContainerDied","Data":"41c587ecc10eca9dd4cbe0eef1bb567d64fee74a46374564312a04497e0f19a7"} Mar 07 04:23:12 crc kubenswrapper[4689]: I0307 04:23:12.099613 4689 patch_prober.go:28] interesting pod/router-default-5444994796-7dvxk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 04:23:12 crc kubenswrapper[4689]: [-]has-synced failed: reason withheld Mar 07 04:23:12 crc kubenswrapper[4689]: 
[+]process-running ok Mar 07 04:23:12 crc kubenswrapper[4689]: healthz check failed Mar 07 04:23:12 crc kubenswrapper[4689]: I0307 04:23:12.099658 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7dvxk" podUID="3f4cf0c7-db05-4fc8-b538-199d3d4a4824" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 04:23:12 crc kubenswrapper[4689]: I0307 04:23:12.108678 4689 generic.go:334] "Generic (PLEG): container finished" podID="98a53e64-9323-454c-9de0-a8d348182a64" containerID="82c1bbc770916edd38d9cea02243cda9697bd464004b82f2544abae9b19d0f92" exitCode=0 Mar 07 04:23:12 crc kubenswrapper[4689]: I0307 04:23:12.108783 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wh2s" event={"ID":"98a53e64-9323-454c-9de0-a8d348182a64","Type":"ContainerDied","Data":"82c1bbc770916edd38d9cea02243cda9697bd464004b82f2544abae9b19d0f92"} Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.290861 4689 patch_prober.go:28] interesting pod/router-default-5444994796-7dvxk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 04:23:13 crc kubenswrapper[4689]: [-]has-synced failed: reason withheld Mar 07 04:23:13 crc kubenswrapper[4689]: [+]process-running ok Mar 07 04:23:13 crc kubenswrapper[4689]: healthz check failed Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.291632 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7dvxk" podUID="3f4cf0c7-db05-4fc8-b538-199d3d4a4824" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.306275 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-675db8c7f8-kd8kw"] Mar 07 
04:23:13 crc kubenswrapper[4689]: E0307 04:23:13.306749 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad" containerName="controller-manager" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.306761 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad" containerName="controller-manager" Mar 07 04:23:13 crc kubenswrapper[4689]: E0307 04:23:13.306772 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320d5766-4cb7-4818-9072-86bfe7e7279d" containerName="collect-profiles" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.306788 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="320d5766-4cb7-4818-9072-86bfe7e7279d" containerName="collect-profiles" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.306987 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac9d062-b854-4a61-b3ef-abf5d1eeb3ad" containerName="controller-manager" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.307016 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="320d5766-4cb7-4818-9072-86bfe7e7279d" containerName="collect-profiles" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.307597 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.309459 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.309758 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e5e7e7ff-bb38-4d50-8361-c84398f40fd5","Type":"ContainerStarted","Data":"66b613eb355a3df5be6c3dd65a9e013872555619522379e1b38bc89cb3ecb984"} Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.311397 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.312116 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.312569 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-675db8c7f8-kd8kw"] Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.312649 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.312753 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.312983 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.319128 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs\") pod 
\"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.325976 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16e0e2e8-673a-446e-b377-f30ffd8edd1f-metrics-certs\") pod \"network-metrics-daemon-95vzv\" (UID: \"16e0e2e8-673a-446e-b377-f30ffd8edd1f\") " pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.338770 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.372845 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.372827669 podStartE2EDuration="3.372827669s" podCreationTimestamp="2026-03-07 04:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:13.366914244 +0000 UTC m=+238.413297733" watchObservedRunningTime="2026-03-07 04:23:13.372827669 +0000 UTC m=+238.419211148" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.398347 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95vzv" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.421489 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-client-ca\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.421551 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-config\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.421573 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl8tn\" (UniqueName: \"kubernetes.io/projected/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-kube-api-access-bl8tn\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.421610 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-serving-cert\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.421629 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-proxy-ca-bundles\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.524446 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-serving-cert\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.524492 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-proxy-ca-bundles\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.524582 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-client-ca\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.524638 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-config\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 
04:23:13.524659 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl8tn\" (UniqueName: \"kubernetes.io/projected/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-kube-api-access-bl8tn\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.526906 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-client-ca\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.527700 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-proxy-ca-bundles\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.528727 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-config\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.531043 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-serving-cert\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " 
pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.544355 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl8tn\" (UniqueName: \"kubernetes.io/projected/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-kube-api-access-bl8tn\") pod \"controller-manager-675db8c7f8-kd8kw\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.600869 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.689327 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.729809 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kube-api-access\") pod \"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91\" (UID: \"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91\") " Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.730086 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kubelet-dir\") pod \"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91\" (UID: \"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91\") " Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.730225 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "34e2a5a6-6e4a-4d6e-8759-73a1c8543d91" (UID: "34e2a5a6-6e4a-4d6e-8759-73a1c8543d91"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.730513 4689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.735914 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "34e2a5a6-6e4a-4d6e-8759-73a1c8543d91" (UID: "34e2a5a6-6e4a-4d6e-8759-73a1c8543d91"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.770444 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-95vzv"] Mar 07 04:23:13 crc kubenswrapper[4689]: W0307 04:23:13.790195 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16e0e2e8_673a_446e_b377_f30ffd8edd1f.slice/crio-5b9677a3eb9a97bf5b34f1a514f42474ad0a4b4832ef9825ec3e5ba958f82448 WatchSource:0}: Error finding container 5b9677a3eb9a97bf5b34f1a514f42474ad0a4b4832ef9825ec3e5ba958f82448: Status 404 returned error can't find the container with id 5b9677a3eb9a97bf5b34f1a514f42474ad0a4b4832ef9825ec3e5ba958f82448 Mar 07 04:23:13 crc kubenswrapper[4689]: I0307 04:23:13.832896 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34e2a5a6-6e4a-4d6e-8759-73a1c8543d91-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:14 crc kubenswrapper[4689]: I0307 04:23:14.103361 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:23:14 crc kubenswrapper[4689]: I0307 04:23:14.106483 
4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-7dvxk" Mar 07 04:23:14 crc kubenswrapper[4689]: I0307 04:23:14.241701 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-675db8c7f8-kd8kw"] Mar 07 04:23:14 crc kubenswrapper[4689]: W0307 04:23:14.297727 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ad83d1b_fe79_4eea_8bb5_8f8e9d50a79e.slice/crio-9f176a7db84d386b761318a2bcb56e2da5786d7604eaa09558605430c95e9b5c WatchSource:0}: Error finding container 9f176a7db84d386b761318a2bcb56e2da5786d7604eaa09558605430c95e9b5c: Status 404 returned error can't find the container with id 9f176a7db84d386b761318a2bcb56e2da5786d7604eaa09558605430c95e9b5c Mar 07 04:23:14 crc kubenswrapper[4689]: I0307 04:23:14.319582 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-95vzv" event={"ID":"16e0e2e8-673a-446e-b377-f30ffd8edd1f","Type":"ContainerStarted","Data":"5b9677a3eb9a97bf5b34f1a514f42474ad0a4b4832ef9825ec3e5ba958f82448"} Mar 07 04:23:14 crc kubenswrapper[4689]: I0307 04:23:14.322553 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" event={"ID":"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e","Type":"ContainerStarted","Data":"9f176a7db84d386b761318a2bcb56e2da5786d7604eaa09558605430c95e9b5c"} Mar 07 04:23:14 crc kubenswrapper[4689]: I0307 04:23:14.333888 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 04:23:14 crc kubenswrapper[4689]: I0307 04:23:14.333902 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"34e2a5a6-6e4a-4d6e-8759-73a1c8543d91","Type":"ContainerDied","Data":"0515049375f34325fe7a96c3f9b602d44ea30f43098df76d97ce9af081329498"} Mar 07 04:23:14 crc kubenswrapper[4689]: I0307 04:23:14.333968 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0515049375f34325fe7a96c3f9b602d44ea30f43098df76d97ce9af081329498" Mar 07 04:23:14 crc kubenswrapper[4689]: I0307 04:23:14.337800 4689 generic.go:334] "Generic (PLEG): container finished" podID="e5e7e7ff-bb38-4d50-8361-c84398f40fd5" containerID="66b613eb355a3df5be6c3dd65a9e013872555619522379e1b38bc89cb3ecb984" exitCode=0 Mar 07 04:23:14 crc kubenswrapper[4689]: I0307 04:23:14.338128 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e5e7e7ff-bb38-4d50-8361-c84398f40fd5","Type":"ContainerDied","Data":"66b613eb355a3df5be6c3dd65a9e013872555619522379e1b38bc89cb3ecb984"} Mar 07 04:23:14 crc kubenswrapper[4689]: I0307 04:23:14.895104 4689 ???:1] "http: TLS handshake error from 192.168.126.11:51832: no serving certificate available for the kubelet" Mar 07 04:23:15 crc kubenswrapper[4689]: I0307 04:23:15.256463 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8ls5c" Mar 07 04:23:15 crc kubenswrapper[4689]: I0307 04:23:15.383078 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" event={"ID":"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e","Type":"ContainerStarted","Data":"66e2812bb98574dceaf89a6944b6774aa06c33515ceebf80cdb13e53087e875f"} Mar 07 04:23:15 crc kubenswrapper[4689]: I0307 04:23:15.383546 4689 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:15 crc kubenswrapper[4689]: I0307 04:23:15.388840 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-95vzv" event={"ID":"16e0e2e8-673a-446e-b377-f30ffd8edd1f","Type":"ContainerStarted","Data":"616578aec488aee9fc064c7511305049d0f3241a2d5c3a5f0bbb322b3f52d56c"} Mar 07 04:23:15 crc kubenswrapper[4689]: I0307 04:23:15.389924 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:15 crc kubenswrapper[4689]: I0307 04:23:15.405005 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" podStartSLOduration=7.404990121 podStartE2EDuration="7.404990121s" podCreationTimestamp="2026-03-07 04:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:15.400927034 +0000 UTC m=+240.447310523" watchObservedRunningTime="2026-03-07 04:23:15.404990121 +0000 UTC m=+240.451373610" Mar 07 04:23:19 crc kubenswrapper[4689]: I0307 04:23:19.462353 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:23:19 crc kubenswrapper[4689]: I0307 04:23:19.473785 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-j4z8p" Mar 07 04:23:19 crc kubenswrapper[4689]: I0307 04:23:19.950986 4689 patch_prober.go:28] interesting pod/downloads-7954f5f757-nnnmk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 07 04:23:19 crc kubenswrapper[4689]: I0307 04:23:19.951043 4689 
patch_prober.go:28] interesting pod/downloads-7954f5f757-nnnmk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 07 04:23:19 crc kubenswrapper[4689]: I0307 04:23:19.951050 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nnnmk" podUID="423b5174-7bed-4fba-af44-51abd9188676" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 07 04:23:19 crc kubenswrapper[4689]: I0307 04:23:19.951094 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nnnmk" podUID="423b5174-7bed-4fba-af44-51abd9188676" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 07 04:23:25 crc kubenswrapper[4689]: I0307 04:23:25.647926 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-675db8c7f8-kd8kw"] Mar 07 04:23:25 crc kubenswrapper[4689]: I0307 04:23:25.649917 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" podUID="7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" containerName="controller-manager" containerID="cri-o://66e2812bb98574dceaf89a6944b6774aa06c33515ceebf80cdb13e53087e875f" gracePeriod=30 Mar 07 04:23:25 crc kubenswrapper[4689]: I0307 04:23:25.657550 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z"] Mar 07 04:23:25 crc kubenswrapper[4689]: I0307 04:23:25.658859 4689 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" podUID="34067b0e-80e0-4d04-813a-0123a7914777" containerName="route-controller-manager" containerID="cri-o://28e082c2442371d7cfe04bd33ee338fb71b5329793e7fb4635839bbbc1056c1f" gracePeriod=30 Mar 07 04:23:27 crc kubenswrapper[4689]: I0307 04:23:27.542152 4689 generic.go:334] "Generic (PLEG): container finished" podID="34067b0e-80e0-4d04-813a-0123a7914777" containerID="28e082c2442371d7cfe04bd33ee338fb71b5329793e7fb4635839bbbc1056c1f" exitCode=0 Mar 07 04:23:27 crc kubenswrapper[4689]: I0307 04:23:27.542359 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" event={"ID":"34067b0e-80e0-4d04-813a-0123a7914777","Type":"ContainerDied","Data":"28e082c2442371d7cfe04bd33ee338fb71b5329793e7fb4635839bbbc1056c1f"} Mar 07 04:23:28 crc kubenswrapper[4689]: I0307 04:23:28.415987 4689 patch_prober.go:28] interesting pod/route-controller-manager-5d75df64b8-nlp7z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Mar 07 04:23:28 crc kubenswrapper[4689]: I0307 04:23:28.416077 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" podUID="34067b0e-80e0-4d04-813a-0123a7914777" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Mar 07 04:23:28 crc kubenswrapper[4689]: I0307 04:23:28.450373 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:23:28 crc kubenswrapper[4689]: I0307 04:23:28.553864 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" containerID="66e2812bb98574dceaf89a6944b6774aa06c33515ceebf80cdb13e53087e875f" exitCode=0 Mar 07 04:23:28 crc kubenswrapper[4689]: I0307 04:23:28.553931 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" event={"ID":"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e","Type":"ContainerDied","Data":"66e2812bb98574dceaf89a6944b6774aa06c33515ceebf80cdb13e53087e875f"} Mar 07 04:23:29 crc kubenswrapper[4689]: I0307 04:23:29.190064 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:23:29 crc kubenswrapper[4689]: I0307 04:23:29.190164 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:23:29 crc kubenswrapper[4689]: I0307 04:23:29.960463 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-nnnmk" Mar 07 04:23:33 crc kubenswrapper[4689]: I0307 04:23:33.671819 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 04:23:33 crc kubenswrapper[4689]: I0307 04:23:33.694818 4689 patch_prober.go:28] interesting pod/controller-manager-675db8c7f8-kd8kw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Mar 07 04:23:33 crc kubenswrapper[4689]: I0307 04:23:33.694891 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" podUID="7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Mar 07 04:23:33 crc kubenswrapper[4689]: I0307 04:23:33.770906 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kubelet-dir\") pod \"e5e7e7ff-bb38-4d50-8361-c84398f40fd5\" (UID: \"e5e7e7ff-bb38-4d50-8361-c84398f40fd5\") " Mar 07 04:23:33 crc kubenswrapper[4689]: I0307 04:23:33.771002 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kube-api-access\") pod \"e5e7e7ff-bb38-4d50-8361-c84398f40fd5\" (UID: \"e5e7e7ff-bb38-4d50-8361-c84398f40fd5\") " Mar 07 04:23:33 crc kubenswrapper[4689]: I0307 04:23:33.771142 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e5e7e7ff-bb38-4d50-8361-c84398f40fd5" (UID: "e5e7e7ff-bb38-4d50-8361-c84398f40fd5"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:23:33 crc kubenswrapper[4689]: I0307 04:23:33.771875 4689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:33 crc kubenswrapper[4689]: I0307 04:23:33.780456 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e5e7e7ff-bb38-4d50-8361-c84398f40fd5" (UID: "e5e7e7ff-bb38-4d50-8361-c84398f40fd5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:23:33 crc kubenswrapper[4689]: I0307 04:23:33.874274 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5e7e7ff-bb38-4d50-8361-c84398f40fd5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:34 crc kubenswrapper[4689]: I0307 04:23:34.627997 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e5e7e7ff-bb38-4d50-8361-c84398f40fd5","Type":"ContainerDied","Data":"6cadd9228be21bfa71f1ea9f80eed31d830e75c6592df901d215bc770eafabf1"} Mar 07 04:23:34 crc kubenswrapper[4689]: I0307 04:23:34.628159 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 04:23:34 crc kubenswrapper[4689]: I0307 04:23:34.628283 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cadd9228be21bfa71f1ea9f80eed31d830e75c6592df901d215bc770eafabf1" Mar 07 04:23:35 crc kubenswrapper[4689]: I0307 04:23:35.406845 4689 ???:1] "http: TLS handshake error from 192.168.126.11:56540: no serving certificate available for the kubelet" Mar 07 04:23:36 crc kubenswrapper[4689]: E0307 04:23:36.305143 4689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:7ec90947c5e42a6b363a181de1231271558968b64076f26200c96a020ef90893: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:7ec90947c5e42a6b363a181de1231271558968b64076f26200c96a020ef90893\": context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 07 04:23:36 crc kubenswrapper[4689]: E0307 04:23:36.305361 4689 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwdzg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-2wh2s_openshift-marketplace(98a53e64-9323-454c-9de0-a8d348182a64): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:7ec90947c5e42a6b363a181de1231271558968b64076f26200c96a020ef90893: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:7ec90947c5e42a6b363a181de1231271558968b64076f26200c96a020ef90893\": context canceled" logger="UnhandledError" Mar 07 04:23:36 crc kubenswrapper[4689]: E0307 04:23:36.307093 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: reading blob sha256:7ec90947c5e42a6b363a181de1231271558968b64076f26200c96a020ef90893: Get \\\"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:7ec90947c5e42a6b363a181de1231271558968b64076f26200c96a020ef90893\\\": context canceled\"" pod="openshift-marketplace/redhat-operators-2wh2s" podUID="98a53e64-9323-454c-9de0-a8d348182a64" Mar 07 04:23:36 crc kubenswrapper[4689]: E0307 04:23:36.312703 4689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 07 04:23:36 crc kubenswrapper[4689]: E0307 04:23:36.312807 4689 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 04:23:36 crc kubenswrapper[4689]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 07 04:23:36 crc kubenswrapper[4689]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fnhp7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29547622-4796h_openshift-infra(33a94bd2-f479-403b-9c36-a708410864aa): ErrImagePull: rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 07 04:23:36 crc kubenswrapper[4689]: > logger="UnhandledError" Mar 07 04:23:36 crc kubenswrapper[4689]: E0307 04:23:36.313948 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29547622-4796h" podUID="33a94bd2-f479-403b-9c36-a708410864aa" Mar 07 04:23:36 crc kubenswrapper[4689]: E0307 04:23:36.643905 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29547622-4796h" podUID="33a94bd2-f479-403b-9c36-a708410864aa" Mar 07 04:23:39 crc kubenswrapper[4689]: E0307 04:23:39.145368 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-2wh2s" podUID="98a53e64-9323-454c-9de0-a8d348182a64" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.214335 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.218837 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.251772 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f98cd78bf-4hnzf"] Mar 07 04:23:39 crc kubenswrapper[4689]: E0307 04:23:39.252037 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" containerName="controller-manager" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.252049 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" containerName="controller-manager" Mar 07 04:23:39 crc kubenswrapper[4689]: E0307 04:23:39.252066 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e2a5a6-6e4a-4d6e-8759-73a1c8543d91" containerName="pruner" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.252073 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e2a5a6-6e4a-4d6e-8759-73a1c8543d91" containerName="pruner" Mar 07 04:23:39 crc kubenswrapper[4689]: E0307 04:23:39.252082 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34067b0e-80e0-4d04-813a-0123a7914777" containerName="route-controller-manager" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.252088 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="34067b0e-80e0-4d04-813a-0123a7914777" containerName="route-controller-manager" Mar 07 04:23:39 crc kubenswrapper[4689]: E0307 04:23:39.252097 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e7e7ff-bb38-4d50-8361-c84398f40fd5" containerName="pruner" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.252102 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e7e7ff-bb38-4d50-8361-c84398f40fd5" containerName="pruner" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.252205 4689 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e5e7e7ff-bb38-4d50-8361-c84398f40fd5" containerName="pruner" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.252216 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" containerName="controller-manager" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.252223 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="34067b0e-80e0-4d04-813a-0123a7914777" containerName="route-controller-manager" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.252234 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e2a5a6-6e4a-4d6e-8759-73a1c8543d91" containerName="pruner" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.254358 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.259254 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f98cd78bf-4hnzf"] Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.361786 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl8tn\" (UniqueName: \"kubernetes.io/projected/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-kube-api-access-bl8tn\") pod \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.361853 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-serving-cert\") pod \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.361888 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-config\") pod \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.361907 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-proxy-ca-bundles\") pod \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.361943 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-client-ca\") pod \"34067b0e-80e0-4d04-813a-0123a7914777\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.361972 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-client-ca\") pod \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\" (UID: \"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e\") " Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.362043 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34067b0e-80e0-4d04-813a-0123a7914777-serving-cert\") pod \"34067b0e-80e0-4d04-813a-0123a7914777\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.362083 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wctdr\" (UniqueName: \"kubernetes.io/projected/34067b0e-80e0-4d04-813a-0123a7914777-kube-api-access-wctdr\") pod \"34067b0e-80e0-4d04-813a-0123a7914777\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.362122 4689 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-config\") pod \"34067b0e-80e0-4d04-813a-0123a7914777\" (UID: \"34067b0e-80e0-4d04-813a-0123a7914777\") " Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.362306 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcvjc\" (UniqueName: \"kubernetes.io/projected/e7f162fe-358d-4f03-833d-f7ce79ddad14-kube-api-access-gcvjc\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.362342 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-client-ca\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.362358 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-proxy-ca-bundles\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.362378 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f162fe-358d-4f03-833d-f7ce79ddad14-serving-cert\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " 
pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.362399 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-config\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.363358 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-client-ca" (OuterVolumeSpecName: "client-ca") pod "7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" (UID: "7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.363378 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" (UID: "7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.363433 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-config" (OuterVolumeSpecName: "config") pod "7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" (UID: "7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.363487 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-client-ca" (OuterVolumeSpecName: "client-ca") pod "34067b0e-80e0-4d04-813a-0123a7914777" (UID: "34067b0e-80e0-4d04-813a-0123a7914777"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.363564 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-config" (OuterVolumeSpecName: "config") pod "34067b0e-80e0-4d04-813a-0123a7914777" (UID: "34067b0e-80e0-4d04-813a-0123a7914777"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.369365 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34067b0e-80e0-4d04-813a-0123a7914777-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "34067b0e-80e0-4d04-813a-0123a7914777" (UID: "34067b0e-80e0-4d04-813a-0123a7914777"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.369349 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34067b0e-80e0-4d04-813a-0123a7914777-kube-api-access-wctdr" (OuterVolumeSpecName: "kube-api-access-wctdr") pod "34067b0e-80e0-4d04-813a-0123a7914777" (UID: "34067b0e-80e0-4d04-813a-0123a7914777"). InnerVolumeSpecName "kube-api-access-wctdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.370680 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" (UID: "7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.371347 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-kube-api-access-bl8tn" (OuterVolumeSpecName: "kube-api-access-bl8tn") pod "7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" (UID: "7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e"). InnerVolumeSpecName "kube-api-access-bl8tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.415710 4689 patch_prober.go:28] interesting pod/route-controller-manager-5d75df64b8-nlp7z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.415773 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" podUID="34067b0e-80e0-4d04-813a-0123a7914777" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.463799 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gcvjc\" (UniqueName: \"kubernetes.io/projected/e7f162fe-358d-4f03-833d-f7ce79ddad14-kube-api-access-gcvjc\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.463862 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-client-ca\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.463878 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-proxy-ca-bundles\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.463897 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f162fe-358d-4f03-833d-f7ce79ddad14-serving-cert\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.463917 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-config\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 
04:23:39.464003 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34067b0e-80e0-4d04-813a-0123a7914777-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.464014 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wctdr\" (UniqueName: \"kubernetes.io/projected/34067b0e-80e0-4d04-813a-0123a7914777-kube-api-access-wctdr\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.464026 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.464036 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl8tn\" (UniqueName: \"kubernetes.io/projected/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-kube-api-access-bl8tn\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.464044 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.464052 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.464060 4689 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.464068 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/34067b0e-80e0-4d04-813a-0123a7914777-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.464076 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.465426 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-config\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.465497 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-client-ca\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.465723 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-proxy-ca-bundles\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.480389 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f162fe-358d-4f03-833d-f7ce79ddad14-serving-cert\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 
04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.484778 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcvjc\" (UniqueName: \"kubernetes.io/projected/e7f162fe-358d-4f03-833d-f7ce79ddad14-kube-api-access-gcvjc\") pod \"controller-manager-f98cd78bf-4hnzf\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.581686 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.664122 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" event={"ID":"34067b0e-80e0-4d04-813a-0123a7914777","Type":"ContainerDied","Data":"7b6a46435158e52176f81cc6e457b877550f657342e327d060b47a3fbf4e70c7"} Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.664190 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.664240 4689 scope.go:117] "RemoveContainer" containerID="28e082c2442371d7cfe04bd33ee338fb71b5329793e7fb4635839bbbc1056c1f" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.681033 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" event={"ID":"7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e","Type":"ContainerDied","Data":"9f176a7db84d386b761318a2bcb56e2da5786d7604eaa09558605430c95e9b5c"} Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.681191 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-675db8c7f8-kd8kw" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.719852 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-675db8c7f8-kd8kw"] Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.722993 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-675db8c7f8-kd8kw"] Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.740281 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z"] Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.742237 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d75df64b8-nlp7z"] Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.835624 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34067b0e-80e0-4d04-813a-0123a7914777" path="/var/lib/kubelet/pods/34067b0e-80e0-4d04-813a-0123a7914777/volumes" Mar 07 04:23:39 crc kubenswrapper[4689]: I0307 04:23:39.836356 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e" path="/var/lib/kubelet/pods/7ad83d1b-fe79-4eea-8bb5-8f8e9d50a79e/volumes" Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.459700 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s654w" Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.604759 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.607536 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.618121 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.621236 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.622239 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.687962 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58886d59-a614-4590-ab02-ec000828f7f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"58886d59-a614-4590-ab02-ec000828f7f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.688617 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58886d59-a614-4590-ab02-ec000828f7f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"58886d59-a614-4590-ab02-ec000828f7f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.789214 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58886d59-a614-4590-ab02-ec000828f7f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"58886d59-a614-4590-ab02-ec000828f7f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.789273 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/58886d59-a614-4590-ab02-ec000828f7f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"58886d59-a614-4590-ab02-ec000828f7f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.789356 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58886d59-a614-4590-ab02-ec000828f7f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"58886d59-a614-4590-ab02-ec000828f7f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.818711 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58886d59-a614-4590-ab02-ec000828f7f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"58886d59-a614-4590-ab02-ec000828f7f3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 04:23:40 crc kubenswrapper[4689]: I0307 04:23:40.948318 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 04:23:41 crc kubenswrapper[4689]: E0307 04:23:41.229754 4689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 07 04:23:41 crc kubenswrapper[4689]: E0307 04:23:41.229934 4689 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp2xw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-72r56_openshift-marketplace(18622abe-0dae-4a1b-83b8-8314bf342ccc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 04:23:41 crc kubenswrapper[4689]: E0307 04:23:41.231114 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-72r56" podUID="18622abe-0dae-4a1b-83b8-8314bf342ccc" Mar 07 04:23:42 crc kubenswrapper[4689]: E0307 04:23:42.938695 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-72r56" podUID="18622abe-0dae-4a1b-83b8-8314bf342ccc" Mar 07 04:23:43 crc kubenswrapper[4689]: E0307 04:23:43.014403 4689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 07 04:23:43 crc kubenswrapper[4689]: E0307 04:23:43.014577 4689 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wrct5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-fmghp_openshift-marketplace(c82c3040-48ed-473b-9386-d58d13364f29): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 04:23:43 crc kubenswrapper[4689]: E0307 04:23:43.015770 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-fmghp" podUID="c82c3040-48ed-473b-9386-d58d13364f29" Mar 07 04:23:43 crc 
kubenswrapper[4689]: E0307 04:23:43.068436 4689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 07 04:23:43 crc kubenswrapper[4689]: E0307 04:23:43.068611 4689 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfn55,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-chw2s_openshift-marketplace(99bbfad4-6baf-4ada-88b8-158f49957da5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 04:23:43 crc kubenswrapper[4689]: E0307 04:23:43.070098 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-chw2s" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.283853 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv"] Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.285592 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.288607 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.288985 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.289075 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.289695 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.289992 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 
04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.291509 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.297004 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv"] Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.423838 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-927k8\" (UniqueName: \"kubernetes.io/projected/3761a3df-c1e6-4279-b063-59fd7b2e24e3-kube-api-access-927k8\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.423906 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3761a3df-c1e6-4279-b063-59fd7b2e24e3-serving-cert\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.423960 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-config\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.423977 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-client-ca\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.524978 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-927k8\" (UniqueName: \"kubernetes.io/projected/3761a3df-c1e6-4279-b063-59fd7b2e24e3-kube-api-access-927k8\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.525038 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3761a3df-c1e6-4279-b063-59fd7b2e24e3-serving-cert\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.525092 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-config\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.525111 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-client-ca\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" 
Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.525894 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-client-ca\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.527649 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-config\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.533290 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3761a3df-c1e6-4279-b063-59fd7b2e24e3-serving-cert\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.543161 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-927k8\" (UniqueName: \"kubernetes.io/projected/3761a3df-c1e6-4279-b063-59fd7b2e24e3-kube-api-access-927k8\") pod \"route-controller-manager-67b99f4cc7-7rwsv\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:43 crc kubenswrapper[4689]: I0307 04:23:43.611474 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.596465 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.597112 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.604974 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.738966 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.739032 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kube-api-access\") pod \"installer-9-crc\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.739288 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-var-lock\") pod \"installer-9-crc\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.840433 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-var-lock\") pod \"installer-9-crc\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.840497 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.840560 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kube-api-access\") pod \"installer-9-crc\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.840608 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-var-lock\") pod \"installer-9-crc\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.840638 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.859563 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kube-api-access\") pod \"installer-9-crc\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:23:44 crc kubenswrapper[4689]: I0307 04:23:44.929026 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:23:45 crc kubenswrapper[4689]: I0307 04:23:45.626218 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f98cd78bf-4hnzf"] Mar 07 04:23:45 crc kubenswrapper[4689]: I0307 04:23:45.728184 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv"] Mar 07 04:23:47 crc kubenswrapper[4689]: I0307 04:23:47.081637 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 04:23:47 crc kubenswrapper[4689]: E0307 04:23:47.194331 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-chw2s" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" Mar 07 04:23:47 crc kubenswrapper[4689]: E0307 04:23:47.194760 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-fmghp" podUID="c82c3040-48ed-473b-9386-d58d13364f29" Mar 07 04:23:47 crc kubenswrapper[4689]: E0307 04:23:47.278791 4689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 07 04:23:47 crc kubenswrapper[4689]: E0307 04:23:47.279053 4689 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2kfv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5fhcx_openshift-marketplace(b84afefb-ca8f-4586-a7bc-6d733cb723b1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 04:23:47 crc kubenswrapper[4689]: E0307 04:23:47.281032 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" 
with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5fhcx" podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" Mar 07 04:23:47 crc kubenswrapper[4689]: E0307 04:23:47.297398 4689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 07 04:23:47 crc kubenswrapper[4689]: E0307 04:23:47.297572 4689 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cptc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:fal
se,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-tgr9z_openshift-marketplace(ec8159c9-c2bd-4af5-8b6b-b855bbd968a5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 04:23:47 crc kubenswrapper[4689]: E0307 04:23:47.299987 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-tgr9z" podUID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" Mar 07 04:23:48 crc kubenswrapper[4689]: E0307 04:23:48.760482 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5fhcx" podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" Mar 07 04:23:48 crc kubenswrapper[4689]: E0307 04:23:48.760741 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-tgr9z" podUID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" Mar 07 04:23:48 crc kubenswrapper[4689]: I0307 04:23:48.777211 4689 scope.go:117] "RemoveContainer" containerID="66e2812bb98574dceaf89a6944b6774aa06c33515ceebf80cdb13e53087e875f" Mar 07 04:23:48 crc kubenswrapper[4689]: E0307 04:23:48.817230 4689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 07 04:23:48 crc kubenswrapper[4689]: E0307 04:23:48.817478 4689 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kxh66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gc2hb_openshift-marketplace(d4a365d2-d74f-4675-b789-27bafa93fbff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" logger="UnhandledError" Mar 07 04:23:48 crc kubenswrapper[4689]: E0307 04:23:48.818892 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gc2hb" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" Mar 07 04:23:48 crc kubenswrapper[4689]: E0307 04:23:48.820440 4689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 07 04:23:48 crc kubenswrapper[4689]: E0307 04:23:48.820586 4689 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bgwsh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hvrwc_openshift-marketplace(fd0c8e82-4247-4dbb-b1a5-4a258259199c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 04:23:48 crc kubenswrapper[4689]: E0307 04:23:48.821726 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hvrwc" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" Mar 07 04:23:49 crc 
kubenswrapper[4689]: I0307 04:23:49.268943 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.288873 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f98cd78bf-4hnzf"] Mar 07 04:23:49 crc kubenswrapper[4689]: W0307 04:23:49.291102 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod40d78e2e_6dbe_47ff_9db0_79bd0057c7d6.slice/crio-c42538de8beef5bb2a23bcc351e457ff150af13c057f04d6e3994eb608c90374 WatchSource:0}: Error finding container c42538de8beef5bb2a23bcc351e457ff150af13c057f04d6e3994eb608c90374: Status 404 returned error can't find the container with id c42538de8beef5bb2a23bcc351e457ff150af13c057f04d6e3994eb608c90374 Mar 07 04:23:49 crc kubenswrapper[4689]: W0307 04:23:49.301331 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7f162fe_358d_4f03_833d_f7ce79ddad14.slice/crio-86cd818e037928ae3cadbd9079c00a8ad83808f41bd7de708e9379fdd0b3b399 WatchSource:0}: Error finding container 86cd818e037928ae3cadbd9079c00a8ad83808f41bd7de708e9379fdd0b3b399: Status 404 returned error can't find the container with id 86cd818e037928ae3cadbd9079c00a8ad83808f41bd7de708e9379fdd0b3b399 Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.383829 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.396800 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv"] Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.749727 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" 
event={"ID":"e7f162fe-358d-4f03-833d-f7ce79ddad14","Type":"ContainerStarted","Data":"8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb"} Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.750264 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" event={"ID":"e7f162fe-358d-4f03-833d-f7ce79ddad14","Type":"ContainerStarted","Data":"86cd818e037928ae3cadbd9079c00a8ad83808f41bd7de708e9379fdd0b3b399"} Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.750285 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.749799 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" podUID="e7f162fe-358d-4f03-833d-f7ce79ddad14" containerName="controller-manager" containerID="cri-o://8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb" gracePeriod=30 Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.757070 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.757669 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-95vzv" event={"ID":"16e0e2e8-673a-446e-b377-f30ffd8edd1f","Type":"ContainerStarted","Data":"18681f1367381de387a7990385fdc80b54d82c970445ee579eb3597dccc47f5c"} Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.759536 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"58886d59-a614-4590-ab02-ec000828f7f3","Type":"ContainerStarted","Data":"996492c6bcdb4ca1948b587817c76e19f674d8dbc58b23fcbb50d0e5294ef7b8"} Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.759564 
4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"58886d59-a614-4590-ab02-ec000828f7f3","Type":"ContainerStarted","Data":"765e2c190a36f0843579d2f5522e5567cdc1edb255fc5ad2694dd255bb3b51c1"} Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.763727 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6","Type":"ContainerStarted","Data":"ed21409e18f5a88176a2be6c9001166797e80fd96201392fda2a091521f633fa"} Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.763754 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6","Type":"ContainerStarted","Data":"c42538de8beef5bb2a23bcc351e457ff150af13c057f04d6e3994eb608c90374"} Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.766078 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" event={"ID":"3761a3df-c1e6-4279-b063-59fd7b2e24e3","Type":"ContainerStarted","Data":"3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051"} Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.766559 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" event={"ID":"3761a3df-c1e6-4279-b063-59fd7b2e24e3","Type":"ContainerStarted","Data":"7604f871a7fd60ca68c9302fd13b4dfee6b4f13a276f3c9eb227a585c09acffd"} Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.766410 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" podUID="3761a3df-c1e6-4279-b063-59fd7b2e24e3" containerName="route-controller-manager" containerID="cri-o://3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051" 
gracePeriod=30 Mar 07 04:23:49 crc kubenswrapper[4689]: E0307 04:23:49.767363 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gc2hb" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" Mar 07 04:23:49 crc kubenswrapper[4689]: E0307 04:23:49.767463 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hvrwc" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.793291 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" podStartSLOduration=24.793275281 podStartE2EDuration="24.793275281s" podCreationTimestamp="2026-03-07 04:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:49.770194566 +0000 UTC m=+274.816578065" watchObservedRunningTime="2026-03-07 04:23:49.793275281 +0000 UTC m=+274.839658770" Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.803014 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=9.802996764 podStartE2EDuration="9.802996764s" podCreationTimestamp="2026-03-07 04:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:49.801300308 +0000 UTC m=+274.847683807" watchObservedRunningTime="2026-03-07 04:23:49.802996764 +0000 UTC m=+274.849380253" Mar 07 04:23:49 crc 
kubenswrapper[4689]: I0307 04:23:49.890688 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" podStartSLOduration=24.890671433 podStartE2EDuration="24.890671433s" podCreationTimestamp="2026-03-07 04:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:49.889529133 +0000 UTC m=+274.935912622" watchObservedRunningTime="2026-03-07 04:23:49.890671433 +0000 UTC m=+274.937054922" Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.892151 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=5.892142783 podStartE2EDuration="5.892142783s" podCreationTimestamp="2026-03-07 04:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:49.869540162 +0000 UTC m=+274.915923651" watchObservedRunningTime="2026-03-07 04:23:49.892142783 +0000 UTC m=+274.938526282" Mar 07 04:23:49 crc kubenswrapper[4689]: I0307 04:23:49.919411 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-95vzv" podStartSLOduration=210.91938593 podStartE2EDuration="3m30.91938593s" podCreationTimestamp="2026-03-07 04:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:49.909951914 +0000 UTC m=+274.956335403" watchObservedRunningTime="2026-03-07 04:23:49.91938593 +0000 UTC m=+274.965769419" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.154347 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.184920 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl"] Mar 07 04:23:50 crc kubenswrapper[4689]: E0307 04:23:50.185787 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f162fe-358d-4f03-833d-f7ce79ddad14" containerName="controller-manager" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.185801 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f162fe-358d-4f03-833d-f7ce79ddad14" containerName="controller-manager" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.185935 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f162fe-358d-4f03-833d-f7ce79ddad14" containerName="controller-manager" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.188337 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.197058 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl"] Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.229671 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-proxy-ca-bundles\") pod \"e7f162fe-358d-4f03-833d-f7ce79ddad14\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.229777 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f162fe-358d-4f03-833d-f7ce79ddad14-serving-cert\") pod \"e7f162fe-358d-4f03-833d-f7ce79ddad14\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.229855 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcvjc\" (UniqueName: \"kubernetes.io/projected/e7f162fe-358d-4f03-833d-f7ce79ddad14-kube-api-access-gcvjc\") pod \"e7f162fe-358d-4f03-833d-f7ce79ddad14\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.229955 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-client-ca\") pod \"e7f162fe-358d-4f03-833d-f7ce79ddad14\" (UID: \"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.229992 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-config\") pod \"e7f162fe-358d-4f03-833d-f7ce79ddad14\" (UID: 
\"e7f162fe-358d-4f03-833d-f7ce79ddad14\") " Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.231873 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e7f162fe-358d-4f03-833d-f7ce79ddad14" (UID: "e7f162fe-358d-4f03-833d-f7ce79ddad14"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.232137 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-client-ca\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.232261 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p6rb\" (UniqueName: \"kubernetes.io/projected/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-kube-api-access-4p6rb\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.232368 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-client-ca" (OuterVolumeSpecName: "client-ca") pod "e7f162fe-358d-4f03-833d-f7ce79ddad14" (UID: "e7f162fe-358d-4f03-833d-f7ce79ddad14"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.232461 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-serving-cert\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.232508 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-proxy-ca-bundles\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.232549 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-config\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.233285 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-config" (OuterVolumeSpecName: "config") pod "e7f162fe-358d-4f03-833d-f7ce79ddad14" (UID: "e7f162fe-358d-4f03-833d-f7ce79ddad14"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.233468 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.233498 4689 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.236354 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f162fe-358d-4f03-833d-f7ce79ddad14-kube-api-access-gcvjc" (OuterVolumeSpecName: "kube-api-access-gcvjc") pod "e7f162fe-358d-4f03-833d-f7ce79ddad14" (UID: "e7f162fe-358d-4f03-833d-f7ce79ddad14"). InnerVolumeSpecName "kube-api-access-gcvjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.236371 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f162fe-358d-4f03-833d-f7ce79ddad14-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7f162fe-358d-4f03-833d-f7ce79ddad14" (UID: "e7f162fe-358d-4f03-833d-f7ce79ddad14"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.248671 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-67b99f4cc7-7rwsv_3761a3df-c1e6-4279-b063-59fd7b2e24e3/route-controller-manager/0.log" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.248752 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.333787 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3761a3df-c1e6-4279-b063-59fd7b2e24e3-serving-cert\") pod \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.333917 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-927k8\" (UniqueName: \"kubernetes.io/projected/3761a3df-c1e6-4279-b063-59fd7b2e24e3-kube-api-access-927k8\") pod \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.333991 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-client-ca\") pod \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.334036 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-config\") pod \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\" (UID: \"3761a3df-c1e6-4279-b063-59fd7b2e24e3\") " Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.334219 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-client-ca\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.334266 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p6rb\" (UniqueName: \"kubernetes.io/projected/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-kube-api-access-4p6rb\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.334343 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-serving-cert\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.334378 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-proxy-ca-bundles\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.334408 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-config\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.334520 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f162fe-358d-4f03-833d-f7ce79ddad14-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.334540 4689 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f162fe-358d-4f03-833d-f7ce79ddad14-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.334561 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcvjc\" (UniqueName: \"kubernetes.io/projected/e7f162fe-358d-4f03-833d-f7ce79ddad14-kube-api-access-gcvjc\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.334983 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-client-ca" (OuterVolumeSpecName: "client-ca") pod "3761a3df-c1e6-4279-b063-59fd7b2e24e3" (UID: "3761a3df-c1e6-4279-b063-59fd7b2e24e3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.335154 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-config" (OuterVolumeSpecName: "config") pod "3761a3df-c1e6-4279-b063-59fd7b2e24e3" (UID: "3761a3df-c1e6-4279-b063-59fd7b2e24e3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.336874 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-proxy-ca-bundles\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.337575 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-client-ca\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.338216 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3761a3df-c1e6-4279-b063-59fd7b2e24e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3761a3df-c1e6-4279-b063-59fd7b2e24e3" (UID: "3761a3df-c1e6-4279-b063-59fd7b2e24e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.338636 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-config\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.339793 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3761a3df-c1e6-4279-b063-59fd7b2e24e3-kube-api-access-927k8" (OuterVolumeSpecName: "kube-api-access-927k8") pod "3761a3df-c1e6-4279-b063-59fd7b2e24e3" (UID: "3761a3df-c1e6-4279-b063-59fd7b2e24e3"). InnerVolumeSpecName "kube-api-access-927k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.340843 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-serving-cert\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.358028 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p6rb\" (UniqueName: \"kubernetes.io/projected/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-kube-api-access-4p6rb\") pod \"controller-manager-76ddcc5b56-qfvbl\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.435804 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-client-ca\") on node \"crc\" 
DevicePath \"\"" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.435873 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3761a3df-c1e6-4279-b063-59fd7b2e24e3-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.435891 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3761a3df-c1e6-4279-b063-59fd7b2e24e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.435910 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-927k8\" (UniqueName: \"kubernetes.io/projected/3761a3df-c1e6-4279-b063-59fd7b2e24e3-kube-api-access-927k8\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.502123 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.705791 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl"] Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.773447 4689 generic.go:334] "Generic (PLEG): container finished" podID="e7f162fe-358d-4f03-833d-f7ce79ddad14" containerID="8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb" exitCode=0 Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.773575 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.774269 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" event={"ID":"e7f162fe-358d-4f03-833d-f7ce79ddad14","Type":"ContainerDied","Data":"8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb"} Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.774315 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f98cd78bf-4hnzf" event={"ID":"e7f162fe-358d-4f03-833d-f7ce79ddad14","Type":"ContainerDied","Data":"86cd818e037928ae3cadbd9079c00a8ad83808f41bd7de708e9379fdd0b3b399"} Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.774335 4689 scope.go:117] "RemoveContainer" containerID="8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.775918 4689 generic.go:334] "Generic (PLEG): container finished" podID="58886d59-a614-4590-ab02-ec000828f7f3" containerID="996492c6bcdb4ca1948b587817c76e19f674d8dbc58b23fcbb50d0e5294ef7b8" exitCode=0 Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.776164 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"58886d59-a614-4590-ab02-ec000828f7f3","Type":"ContainerDied","Data":"996492c6bcdb4ca1948b587817c76e19f674d8dbc58b23fcbb50d0e5294ef7b8"} Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.782691 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-67b99f4cc7-7rwsv_3761a3df-c1e6-4279-b063-59fd7b2e24e3/route-controller-manager/0.log" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.782775 4689 generic.go:334] "Generic (PLEG): container finished" podID="3761a3df-c1e6-4279-b063-59fd7b2e24e3" 
containerID="3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051" exitCode=255 Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.782869 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" event={"ID":"3761a3df-c1e6-4279-b063-59fd7b2e24e3","Type":"ContainerDied","Data":"3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051"} Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.782920 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" event={"ID":"3761a3df-c1e6-4279-b063-59fd7b2e24e3","Type":"ContainerDied","Data":"7604f871a7fd60ca68c9302fd13b4dfee6b4f13a276f3c9eb227a585c09acffd"} Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.783028 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.786301 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" event={"ID":"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97","Type":"ContainerStarted","Data":"66630a5fc8b080e711b9d6067e12f9b38b4b7b06387b71633020449c2dd348ea"} Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.822260 4689 scope.go:117] "RemoveContainer" containerID="8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb" Mar 07 04:23:50 crc kubenswrapper[4689]: E0307 04:23:50.822732 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb\": container with ID starting with 8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb not found: ID does not exist" 
containerID="8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.822793 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb"} err="failed to get container status \"8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb\": rpc error: code = NotFound desc = could not find container \"8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb\": container with ID starting with 8a67df9f87def7193998b663965cfcaffb01d57e1d65d8213164efff3aa94afb not found: ID does not exist" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.822829 4689 scope.go:117] "RemoveContainer" containerID="3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.847233 4689 scope.go:117] "RemoveContainer" containerID="3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051" Mar 07 04:23:50 crc kubenswrapper[4689]: E0307 04:23:50.847746 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051\": container with ID starting with 3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051 not found: ID does not exist" containerID="3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.847786 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051"} err="failed to get container status \"3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051\": rpc error: code = NotFound desc = could not find container \"3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051\": container with ID starting with 
3d1138f247cd772351b9eadfb6e6023f9c30962eb53844cd4e5fa076aef60051 not found: ID does not exist" Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.860568 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f98cd78bf-4hnzf"] Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.878074 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f98cd78bf-4hnzf"] Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.881069 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv"] Mar 07 04:23:50 crc kubenswrapper[4689]: I0307 04:23:50.883534 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b99f4cc7-7rwsv"] Mar 07 04:23:51 crc kubenswrapper[4689]: I0307 04:23:51.805342 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" event={"ID":"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97","Type":"ContainerStarted","Data":"cc36d7839515456fff41aa54e4dee9e0f91ffcef5480252e04939dc471411049"} Mar 07 04:23:51 crc kubenswrapper[4689]: I0307 04:23:51.826663 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" podStartSLOduration=6.826619951 podStartE2EDuration="6.826619951s" podCreationTimestamp="2026-03-07 04:23:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:51.823397294 +0000 UTC m=+276.869780803" watchObservedRunningTime="2026-03-07 04:23:51.826619951 +0000 UTC m=+276.873003440" Mar 07 04:23:51 crc kubenswrapper[4689]: I0307 04:23:51.845510 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3761a3df-c1e6-4279-b063-59fd7b2e24e3" 
path="/var/lib/kubelet/pods/3761a3df-c1e6-4279-b063-59fd7b2e24e3/volumes" Mar 07 04:23:51 crc kubenswrapper[4689]: I0307 04:23:51.846084 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f162fe-358d-4f03-833d-f7ce79ddad14" path="/var/lib/kubelet/pods/e7f162fe-358d-4f03-833d-f7ce79ddad14/volumes" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.074661 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.189494 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58886d59-a614-4590-ab02-ec000828f7f3-kubelet-dir\") pod \"58886d59-a614-4590-ab02-ec000828f7f3\" (UID: \"58886d59-a614-4590-ab02-ec000828f7f3\") " Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.189579 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58886d59-a614-4590-ab02-ec000828f7f3-kube-api-access\") pod \"58886d59-a614-4590-ab02-ec000828f7f3\" (UID: \"58886d59-a614-4590-ab02-ec000828f7f3\") " Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.189751 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58886d59-a614-4590-ab02-ec000828f7f3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "58886d59-a614-4590-ab02-ec000828f7f3" (UID: "58886d59-a614-4590-ab02-ec000828f7f3"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.189932 4689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58886d59-a614-4590-ab02-ec000828f7f3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.196270 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58886d59-a614-4590-ab02-ec000828f7f3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "58886d59-a614-4590-ab02-ec000828f7f3" (UID: "58886d59-a614-4590-ab02-ec000828f7f3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.291435 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58886d59-a614-4590-ab02-ec000828f7f3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.292380 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp"] Mar 07 04:23:52 crc kubenswrapper[4689]: E0307 04:23:52.292700 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58886d59-a614-4590-ab02-ec000828f7f3" containerName="pruner" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.292718 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="58886d59-a614-4590-ab02-ec000828f7f3" containerName="pruner" Mar 07 04:23:52 crc kubenswrapper[4689]: E0307 04:23:52.292735 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3761a3df-c1e6-4279-b063-59fd7b2e24e3" containerName="route-controller-manager" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.292743 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="3761a3df-c1e6-4279-b063-59fd7b2e24e3" 
containerName="route-controller-manager" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.292856 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="3761a3df-c1e6-4279-b063-59fd7b2e24e3" containerName="route-controller-manager" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.292874 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="58886d59-a614-4590-ab02-ec000828f7f3" containerName="pruner" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.293489 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.297632 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.298090 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.299494 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.299637 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.299772 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.300271 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.309163 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp"] Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.392691 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-client-ca\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.392766 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-serving-cert\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.392794 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhjf4\" (UniqueName: \"kubernetes.io/projected/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-kube-api-access-lhjf4\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.393050 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-config\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.495983 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-serving-cert\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.496079 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhjf4\" (UniqueName: \"kubernetes.io/projected/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-kube-api-access-lhjf4\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.496109 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-config\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.496219 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-client-ca\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.498047 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-config\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " 
pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.498452 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-client-ca\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.502366 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-serving-cert\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.519614 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhjf4\" (UniqueName: \"kubernetes.io/projected/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-kube-api-access-lhjf4\") pod \"route-controller-manager-6b6787b7dd-k2ccp\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.688637 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.826258 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.826626 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"58886d59-a614-4590-ab02-ec000828f7f3","Type":"ContainerDied","Data":"765e2c190a36f0843579d2f5522e5567cdc1edb255fc5ad2694dd255bb3b51c1"} Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.827338 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="765e2c190a36f0843579d2f5522e5567cdc1edb255fc5ad2694dd255bb3b51c1" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.827369 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:52 crc kubenswrapper[4689]: I0307 04:23:52.835204 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:23:54 crc kubenswrapper[4689]: I0307 04:23:53.137789 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp"] Mar 07 04:23:54 crc kubenswrapper[4689]: I0307 04:23:53.833112 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" event={"ID":"41c8a0b4-594d-4c2d-80e5-8dc6920959a6","Type":"ContainerStarted","Data":"08dd80ecc252a31f24f6520b27a2080bb6597ec42c26777be1e2f293f618d852"} Mar 07 04:23:54 crc kubenswrapper[4689]: I0307 04:23:53.833145 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" event={"ID":"41c8a0b4-594d-4c2d-80e5-8dc6920959a6","Type":"ContainerStarted","Data":"c1a7bd507fa1f0571d2d84c3aecde20e6fac80b0997b518661dbf99fa8a4ad4d"} Mar 07 04:23:54 crc kubenswrapper[4689]: I0307 04:23:53.852039 
4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" podStartSLOduration=8.852018417 podStartE2EDuration="8.852018417s" podCreationTimestamp="2026-03-07 04:23:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:23:53.849195191 +0000 UTC m=+278.895578680" watchObservedRunningTime="2026-03-07 04:23:53.852018417 +0000 UTC m=+278.898401906" Mar 07 04:23:54 crc kubenswrapper[4689]: I0307 04:23:54.838774 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:54 crc kubenswrapper[4689]: I0307 04:23:54.845662 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:23:59 crc kubenswrapper[4689]: I0307 04:23:59.190455 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:23:59 crc kubenswrapper[4689]: I0307 04:23:59.190682 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:23:59 crc kubenswrapper[4689]: I0307 04:23:59.190894 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:23:59 crc kubenswrapper[4689]: I0307 04:23:59.192537 
4689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83"} pod="openshift-machine-config-operator/machine-config-daemon-dss5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 04:23:59 crc kubenswrapper[4689]: I0307 04:23:59.192749 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" containerID="cri-o://75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83" gracePeriod=600 Mar 07 04:23:59 crc kubenswrapper[4689]: I0307 04:23:59.870444 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerID="75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83" exitCode=0 Mar 07 04:23:59 crc kubenswrapper[4689]: I0307 04:23:59.870518 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerDied","Data":"75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83"} Mar 07 04:24:00 crc kubenswrapper[4689]: I0307 04:24:00.149146 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547624-v2j6p"] Mar 07 04:24:00 crc kubenswrapper[4689]: I0307 04:24:00.150809 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547624-v2j6p" Mar 07 04:24:00 crc kubenswrapper[4689]: I0307 04:24:00.155585 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:24:00 crc kubenswrapper[4689]: I0307 04:24:00.161232 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547624-v2j6p"] Mar 07 04:24:00 crc kubenswrapper[4689]: I0307 04:24:00.218255 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxzb\" (UniqueName: \"kubernetes.io/projected/d98532b6-658d-41de-8e97-0f941ad34251-kube-api-access-4dxzb\") pod \"auto-csr-approver-29547624-v2j6p\" (UID: \"d98532b6-658d-41de-8e97-0f941ad34251\") " pod="openshift-infra/auto-csr-approver-29547624-v2j6p" Mar 07 04:24:00 crc kubenswrapper[4689]: I0307 04:24:00.320271 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxzb\" (UniqueName: \"kubernetes.io/projected/d98532b6-658d-41de-8e97-0f941ad34251-kube-api-access-4dxzb\") pod \"auto-csr-approver-29547624-v2j6p\" (UID: \"d98532b6-658d-41de-8e97-0f941ad34251\") " pod="openshift-infra/auto-csr-approver-29547624-v2j6p" Mar 07 04:24:00 crc kubenswrapper[4689]: I0307 04:24:00.344061 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxzb\" (UniqueName: \"kubernetes.io/projected/d98532b6-658d-41de-8e97-0f941ad34251-kube-api-access-4dxzb\") pod \"auto-csr-approver-29547624-v2j6p\" (UID: \"d98532b6-658d-41de-8e97-0f941ad34251\") " pod="openshift-infra/auto-csr-approver-29547624-v2j6p" Mar 07 04:24:00 crc kubenswrapper[4689]: I0307 04:24:00.498280 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547624-v2j6p" Mar 07 04:24:00 crc kubenswrapper[4689]: I0307 04:24:00.887645 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wh2s" event={"ID":"98a53e64-9323-454c-9de0-a8d348182a64","Type":"ContainerStarted","Data":"f5182ccad7ad084666ecb485eae286df8d43f273e433ae0b099873949b374b47"} Mar 07 04:24:00 crc kubenswrapper[4689]: I0307 04:24:00.890069 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547622-4796h" event={"ID":"33a94bd2-f479-403b-9c36-a708410864aa","Type":"ContainerStarted","Data":"917935269d5fc94cd233e15f2825a1cb7041f968721bc786a2a97c66eb8a5338"} Mar 07 04:24:00 crc kubenswrapper[4689]: I0307 04:24:00.893445 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"23812cdb895a5f0e0a59b8a60c77194b6f8d32629f6b8cae7e8e7f3fc587e614"} Mar 07 04:24:01 crc kubenswrapper[4689]: I0307 04:24:01.022636 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547624-v2j6p"] Mar 07 04:24:01 crc kubenswrapper[4689]: I0307 04:24:01.326953 4689 csr.go:261] certificate signing request csr-wcgfx is approved, waiting to be issued Mar 07 04:24:01 crc kubenswrapper[4689]: I0307 04:24:01.342058 4689 csr.go:257] certificate signing request csr-wcgfx is issued Mar 07 04:24:01 crc kubenswrapper[4689]: I0307 04:24:01.906258 4689 generic.go:334] "Generic (PLEG): container finished" podID="98a53e64-9323-454c-9de0-a8d348182a64" containerID="f5182ccad7ad084666ecb485eae286df8d43f273e433ae0b099873949b374b47" exitCode=0 Mar 07 04:24:01 crc kubenswrapper[4689]: I0307 04:24:01.906386 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wh2s" 
event={"ID":"98a53e64-9323-454c-9de0-a8d348182a64","Type":"ContainerDied","Data":"f5182ccad7ad084666ecb485eae286df8d43f273e433ae0b099873949b374b47"} Mar 07 04:24:01 crc kubenswrapper[4689]: I0307 04:24:01.909766 4689 generic.go:334] "Generic (PLEG): container finished" podID="33a94bd2-f479-403b-9c36-a708410864aa" containerID="917935269d5fc94cd233e15f2825a1cb7041f968721bc786a2a97c66eb8a5338" exitCode=0 Mar 07 04:24:01 crc kubenswrapper[4689]: I0307 04:24:01.909927 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547622-4796h" event={"ID":"33a94bd2-f479-403b-9c36-a708410864aa","Type":"ContainerDied","Data":"917935269d5fc94cd233e15f2825a1cb7041f968721bc786a2a97c66eb8a5338"} Mar 07 04:24:01 crc kubenswrapper[4689]: I0307 04:24:01.912886 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547624-v2j6p" event={"ID":"d98532b6-658d-41de-8e97-0f941ad34251","Type":"ContainerStarted","Data":"b7d8193e1b9a1c7a6474f56ddc441f3ca260835d135077e537d91239fa43dd97"} Mar 07 04:24:02 crc kubenswrapper[4689]: I0307 04:24:02.344584 4689 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-06 05:07:34.052701402 +0000 UTC Mar 07 04:24:02 crc kubenswrapper[4689]: I0307 04:24:02.345044 4689 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7320h43m31.707662252s for next certificate rotation Mar 07 04:24:02 crc kubenswrapper[4689]: I0307 04:24:02.926277 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gc2hb" event={"ID":"d4a365d2-d74f-4675-b789-27bafa93fbff","Type":"ContainerStarted","Data":"759bcb7508f2585a20d2e984948abd716013456d87e2418a66f10cb6cb385205"} Mar 07 04:24:02 crc kubenswrapper[4689]: I0307 04:24:02.928559 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvrwc" 
event={"ID":"fd0c8e82-4247-4dbb-b1a5-4a258259199c","Type":"ContainerStarted","Data":"7282802cb62b4284bd840ee5b1d33d0e80c19fe39a202ae86ce039eea1475e73"} Mar 07 04:24:02 crc kubenswrapper[4689]: I0307 04:24:02.936631 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmghp" event={"ID":"c82c3040-48ed-473b-9386-d58d13364f29","Type":"ContainerStarted","Data":"1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7"} Mar 07 04:24:02 crc kubenswrapper[4689]: I0307 04:24:02.938896 4689 generic.go:334] "Generic (PLEG): container finished" podID="18622abe-0dae-4a1b-83b8-8314bf342ccc" containerID="7ab433c9d3b663b61fde835084897dfe5badd525c40323e2c79e8aca26a854f7" exitCode=0 Mar 07 04:24:02 crc kubenswrapper[4689]: I0307 04:24:02.938976 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72r56" event={"ID":"18622abe-0dae-4a1b-83b8-8314bf342ccc","Type":"ContainerDied","Data":"7ab433c9d3b663b61fde835084897dfe5badd525c40323e2c79e8aca26a854f7"} Mar 07 04:24:02 crc kubenswrapper[4689]: I0307 04:24:02.949863 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wh2s" event={"ID":"98a53e64-9323-454c-9de0-a8d348182a64","Type":"ContainerStarted","Data":"c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe"} Mar 07 04:24:02 crc kubenswrapper[4689]: I0307 04:24:02.952210 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgr9z" event={"ID":"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5","Type":"ContainerStarted","Data":"d6f4f56df28874c6cec59e556d26e3408f2303402255b872cd72b0e4ccfcc540"} Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.018938 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2wh2s" podStartSLOduration=3.606628648 podStartE2EDuration="54.018922246s" podCreationTimestamp="2026-03-07 04:23:09 +0000 
UTC" firstStartedPulling="2026-03-07 04:23:12.110465436 +0000 UTC m=+237.156848925" lastFinishedPulling="2026-03-07 04:24:02.522759024 +0000 UTC m=+287.569142523" observedRunningTime="2026-03-07 04:24:03.016575342 +0000 UTC m=+288.062958831" watchObservedRunningTime="2026-03-07 04:24:03.018922246 +0000 UTC m=+288.065305735" Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.313714 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547622-4796h" Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.368090 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnhp7\" (UniqueName: \"kubernetes.io/projected/33a94bd2-f479-403b-9c36-a708410864aa-kube-api-access-fnhp7\") pod \"33a94bd2-f479-403b-9c36-a708410864aa\" (UID: \"33a94bd2-f479-403b-9c36-a708410864aa\") " Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.375760 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33a94bd2-f479-403b-9c36-a708410864aa-kube-api-access-fnhp7" (OuterVolumeSpecName: "kube-api-access-fnhp7") pod "33a94bd2-f479-403b-9c36-a708410864aa" (UID: "33a94bd2-f479-403b-9c36-a708410864aa"). InnerVolumeSpecName "kube-api-access-fnhp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.469969 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnhp7\" (UniqueName: \"kubernetes.io/projected/33a94bd2-f479-403b-9c36-a708410864aa-kube-api-access-fnhp7\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.974700 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72r56" event={"ID":"18622abe-0dae-4a1b-83b8-8314bf342ccc","Type":"ContainerStarted","Data":"5d3ec338aa5763b8677b48433aecdd0ae0b4cb67ca8727d892f44eb8dfc55a84"} Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.976684 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547622-4796h" event={"ID":"33a94bd2-f479-403b-9c36-a708410864aa","Type":"ContainerDied","Data":"cf868dfbf890c019a784d80bdd222eded7347837386dbf6a8bb49a62941384ee"} Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.976730 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf868dfbf890c019a784d80bdd222eded7347837386dbf6a8bb49a62941384ee" Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.976815 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547622-4796h" Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.981192 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chw2s" event={"ID":"99bbfad4-6baf-4ada-88b8-158f49957da5","Type":"ContainerStarted","Data":"2119b807ba910329fe2bcb1c84fc119e4e511acab92fd0b5a963ba6a463b0e0e"} Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.984265 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fhcx" event={"ID":"b84afefb-ca8f-4586-a7bc-6d733cb723b1","Type":"ContainerStarted","Data":"4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e"} Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.986477 4689 generic.go:334] "Generic (PLEG): container finished" podID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" containerID="d6f4f56df28874c6cec59e556d26e3408f2303402255b872cd72b0e4ccfcc540" exitCode=0 Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.986523 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgr9z" event={"ID":"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5","Type":"ContainerDied","Data":"d6f4f56df28874c6cec59e556d26e3408f2303402255b872cd72b0e4ccfcc540"} Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.989219 4689 generic.go:334] "Generic (PLEG): container finished" podID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerID="759bcb7508f2585a20d2e984948abd716013456d87e2418a66f10cb6cb385205" exitCode=0 Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.989252 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gc2hb" event={"ID":"d4a365d2-d74f-4675-b789-27bafa93fbff","Type":"ContainerDied","Data":"759bcb7508f2585a20d2e984948abd716013456d87e2418a66f10cb6cb385205"} Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.993853 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" containerID="7282802cb62b4284bd840ee5b1d33d0e80c19fe39a202ae86ce039eea1475e73" exitCode=0 Mar 07 04:24:03 crc kubenswrapper[4689]: I0307 04:24:03.993938 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvrwc" event={"ID":"fd0c8e82-4247-4dbb-b1a5-4a258259199c","Type":"ContainerDied","Data":"7282802cb62b4284bd840ee5b1d33d0e80c19fe39a202ae86ce039eea1475e73"} Mar 07 04:24:04 crc kubenswrapper[4689]: I0307 04:24:04.003265 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-72r56" podStartSLOduration=3.552505993 podStartE2EDuration="56.003244262s" podCreationTimestamp="2026-03-07 04:23:08 +0000 UTC" firstStartedPulling="2026-03-07 04:23:11.009007853 +0000 UTC m=+236.055391342" lastFinishedPulling="2026-03-07 04:24:03.459746122 +0000 UTC m=+288.506129611" observedRunningTime="2026-03-07 04:24:03.999914081 +0000 UTC m=+289.046297600" watchObservedRunningTime="2026-03-07 04:24:04.003244262 +0000 UTC m=+289.049627761" Mar 07 04:24:04 crc kubenswrapper[4689]: I0307 04:24:04.014480 4689 generic.go:334] "Generic (PLEG): container finished" podID="d98532b6-658d-41de-8e97-0f941ad34251" containerID="f97a9da58e3b528f5f004a14e794d6ed3b0c80c34b9269631df208850715d7f2" exitCode=0 Mar 07 04:24:04 crc kubenswrapper[4689]: I0307 04:24:04.014614 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547624-v2j6p" event={"ID":"d98532b6-658d-41de-8e97-0f941ad34251","Type":"ContainerDied","Data":"f97a9da58e3b528f5f004a14e794d6ed3b0c80c34b9269631df208850715d7f2"} Mar 07 04:24:04 crc kubenswrapper[4689]: I0307 04:24:04.020650 4689 generic.go:334] "Generic (PLEG): container finished" podID="c82c3040-48ed-473b-9386-d58d13364f29" containerID="1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7" exitCode=0 Mar 07 04:24:04 crc kubenswrapper[4689]: I0307 04:24:04.020737 4689 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmghp" event={"ID":"c82c3040-48ed-473b-9386-d58d13364f29","Type":"ContainerDied","Data":"1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7"} Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.026939 4689 generic.go:334] "Generic (PLEG): container finished" podID="99bbfad4-6baf-4ada-88b8-158f49957da5" containerID="2119b807ba910329fe2bcb1c84fc119e4e511acab92fd0b5a963ba6a463b0e0e" exitCode=0 Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.027013 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chw2s" event={"ID":"99bbfad4-6baf-4ada-88b8-158f49957da5","Type":"ContainerDied","Data":"2119b807ba910329fe2bcb1c84fc119e4e511acab92fd0b5a963ba6a463b0e0e"} Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.029735 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvrwc" event={"ID":"fd0c8e82-4247-4dbb-b1a5-4a258259199c","Type":"ContainerStarted","Data":"e688ca2a50484d2e27ff1b3acc6a67a90b679e4083c06eccffb6c2e49b5a4484"} Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.033502 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmghp" event={"ID":"c82c3040-48ed-473b-9386-d58d13364f29","Type":"ContainerStarted","Data":"5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d"} Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.036859 4689 generic.go:334] "Generic (PLEG): container finished" podID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerID="4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e" exitCode=0 Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.036936 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fhcx" 
event={"ID":"b84afefb-ca8f-4586-a7bc-6d733cb723b1","Type":"ContainerDied","Data":"4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e"} Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.040866 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgr9z" event={"ID":"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5","Type":"ContainerStarted","Data":"eea2a417227a70af19d80116d0b67a59e65a9773db20ae2254f9852d6a6682bd"} Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.046207 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gc2hb" event={"ID":"d4a365d2-d74f-4675-b789-27bafa93fbff","Type":"ContainerStarted","Data":"891e3d2d19f1f2030d54aeeb15c5472652e84fd8d62eb840acd78c4b67b9f36f"} Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.078238 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tgr9z" podStartSLOduration=2.5907236129999998 podStartE2EDuration="57.078222349s" podCreationTimestamp="2026-03-07 04:23:08 +0000 UTC" firstStartedPulling="2026-03-07 04:23:09.897362691 +0000 UTC m=+234.943746180" lastFinishedPulling="2026-03-07 04:24:04.384861427 +0000 UTC m=+289.431244916" observedRunningTime="2026-03-07 04:24:05.077011646 +0000 UTC m=+290.123395135" watchObservedRunningTime="2026-03-07 04:24:05.078222349 +0000 UTC m=+290.124605838" Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.120475 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hvrwc" podStartSLOduration=3.466570057 podStartE2EDuration="59.12045944s" podCreationTimestamp="2026-03-07 04:23:06 +0000 UTC" firstStartedPulling="2026-03-07 04:23:08.75524344 +0000 UTC m=+233.801626919" lastFinishedPulling="2026-03-07 04:24:04.409132813 +0000 UTC m=+289.455516302" observedRunningTime="2026-03-07 04:24:05.11712967 +0000 UTC m=+290.163513169" 
watchObservedRunningTime="2026-03-07 04:24:05.12045944 +0000 UTC m=+290.166842929" Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.182286 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gc2hb" podStartSLOduration=3.582065178 podStartE2EDuration="59.182267161s" podCreationTimestamp="2026-03-07 04:23:06 +0000 UTC" firstStartedPulling="2026-03-07 04:23:08.831771277 +0000 UTC m=+233.878154756" lastFinishedPulling="2026-03-07 04:24:04.43197325 +0000 UTC m=+289.478356739" observedRunningTime="2026-03-07 04:24:05.180887273 +0000 UTC m=+290.227270762" watchObservedRunningTime="2026-03-07 04:24:05.182267161 +0000 UTC m=+290.228650650" Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.208832 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fmghp" podStartSLOduration=2.278107939 podStartE2EDuration="59.208807058s" podCreationTimestamp="2026-03-07 04:23:06 +0000 UTC" firstStartedPulling="2026-03-07 04:23:07.631640715 +0000 UTC m=+232.678024204" lastFinishedPulling="2026-03-07 04:24:04.562339834 +0000 UTC m=+289.608723323" observedRunningTime="2026-03-07 04:24:05.20849414 +0000 UTC m=+290.254877639" watchObservedRunningTime="2026-03-07 04:24:05.208807058 +0000 UTC m=+290.255190547" Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.480884 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547624-v2j6p" Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.501682 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dxzb\" (UniqueName: \"kubernetes.io/projected/d98532b6-658d-41de-8e97-0f941ad34251-kube-api-access-4dxzb\") pod \"d98532b6-658d-41de-8e97-0f941ad34251\" (UID: \"d98532b6-658d-41de-8e97-0f941ad34251\") " Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.512492 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98532b6-658d-41de-8e97-0f941ad34251-kube-api-access-4dxzb" (OuterVolumeSpecName: "kube-api-access-4dxzb") pod "d98532b6-658d-41de-8e97-0f941ad34251" (UID: "d98532b6-658d-41de-8e97-0f941ad34251"). InnerVolumeSpecName "kube-api-access-4dxzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.604058 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dxzb\" (UniqueName: \"kubernetes.io/projected/d98532b6-658d-41de-8e97-0f941ad34251-kube-api-access-4dxzb\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.678955 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl"] Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.679488 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" podUID="0d82a6b2-6b23-4036-8eb5-fb6b1366bc97" containerName="controller-manager" containerID="cri-o://cc36d7839515456fff41aa54e4dee9e0f91ffcef5480252e04939dc471411049" gracePeriod=30 Mar 07 04:24:05 crc kubenswrapper[4689]: I0307 04:24:05.692848 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp"] Mar 07 04:24:05 crc 
kubenswrapper[4689]: I0307 04:24:05.693245 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" podUID="41c8a0b4-594d-4c2d-80e5-8dc6920959a6" containerName="route-controller-manager" containerID="cri-o://08dd80ecc252a31f24f6520b27a2080bb6597ec42c26777be1e2f293f618d852" gracePeriod=30 Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.054638 4689 generic.go:334] "Generic (PLEG): container finished" podID="0d82a6b2-6b23-4036-8eb5-fb6b1366bc97" containerID="cc36d7839515456fff41aa54e4dee9e0f91ffcef5480252e04939dc471411049" exitCode=0 Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.054736 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" event={"ID":"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97","Type":"ContainerDied","Data":"cc36d7839515456fff41aa54e4dee9e0f91ffcef5480252e04939dc471411049"} Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.065113 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547624-v2j6p" event={"ID":"d98532b6-658d-41de-8e97-0f941ad34251","Type":"ContainerDied","Data":"b7d8193e1b9a1c7a6474f56ddc441f3ca260835d135077e537d91239fa43dd97"} Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.065161 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7d8193e1b9a1c7a6474f56ddc441f3ca260835d135077e537d91239fa43dd97" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.065185 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547624-v2j6p" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.068740 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chw2s" event={"ID":"99bbfad4-6baf-4ada-88b8-158f49957da5","Type":"ContainerStarted","Data":"ba5f0a13bfa0baabfe33d32c22a2462159f366c9bf71c75679855771d7da1d48"} Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.072255 4689 generic.go:334] "Generic (PLEG): container finished" podID="41c8a0b4-594d-4c2d-80e5-8dc6920959a6" containerID="08dd80ecc252a31f24f6520b27a2080bb6597ec42c26777be1e2f293f618d852" exitCode=0 Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.072346 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" event={"ID":"41c8a0b4-594d-4c2d-80e5-8dc6920959a6","Type":"ContainerDied","Data":"08dd80ecc252a31f24f6520b27a2080bb6597ec42c26777be1e2f293f618d852"} Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.086157 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fhcx" event={"ID":"b84afefb-ca8f-4586-a7bc-6d733cb723b1","Type":"ContainerStarted","Data":"0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215"} Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.089304 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-chw2s" podStartSLOduration=3.412436086 podStartE2EDuration="1m0.089286077s" podCreationTimestamp="2026-03-07 04:23:06 +0000 UTC" firstStartedPulling="2026-03-07 04:23:08.776729873 +0000 UTC m=+233.823113362" lastFinishedPulling="2026-03-07 04:24:05.453579864 +0000 UTC m=+290.499963353" observedRunningTime="2026-03-07 04:24:06.085967917 +0000 UTC m=+291.132351406" watchObservedRunningTime="2026-03-07 04:24:06.089286077 +0000 UTC m=+291.135669576" Mar 07 04:24:06 crc 
kubenswrapper[4689]: I0307 04:24:06.416765 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.417662 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.519500 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5fhcx" podStartSLOduration=4.169056093 podStartE2EDuration="57.519480005s" podCreationTimestamp="2026-03-07 04:23:09 +0000 UTC" firstStartedPulling="2026-03-07 04:23:12.10640181 +0000 UTC m=+237.152785299" lastFinishedPulling="2026-03-07 04:24:05.456825722 +0000 UTC m=+290.503209211" observedRunningTime="2026-03-07 04:24:06.118739283 +0000 UTC m=+291.165122772" watchObservedRunningTime="2026-03-07 04:24:06.519480005 +0000 UTC m=+291.565863494" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.636327 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.636485 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.868376 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.899563 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8"] Mar 07 04:24:06 crc kubenswrapper[4689]: E0307 04:24:06.899882 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a94bd2-f479-403b-9c36-a708410864aa" containerName="oc" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.899899 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a94bd2-f479-403b-9c36-a708410864aa" containerName="oc" Mar 07 04:24:06 crc kubenswrapper[4689]: E0307 04:24:06.899907 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98532b6-658d-41de-8e97-0f941ad34251" containerName="oc" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.899913 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98532b6-658d-41de-8e97-0f941ad34251" containerName="oc" Mar 07 04:24:06 crc kubenswrapper[4689]: E0307 04:24:06.899924 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c8a0b4-594d-4c2d-80e5-8dc6920959a6" containerName="route-controller-manager" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.899930 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c8a0b4-594d-4c2d-80e5-8dc6920959a6" containerName="route-controller-manager" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.900021 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98532b6-658d-41de-8e97-0f941ad34251" containerName="oc" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.900041 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="33a94bd2-f479-403b-9c36-a708410864aa" containerName="oc" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.900053 4689 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="41c8a0b4-594d-4c2d-80e5-8dc6920959a6" containerName="route-controller-manager" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.900512 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.914985 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8"] Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.922296 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.922350 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.922769 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhjf4\" (UniqueName: \"kubernetes.io/projected/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-kube-api-access-lhjf4\") pod \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.922798 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-config\") pod \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.922822 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-client-ca\") pod \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 
04:24:06.922844 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-serving-cert\") pod \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\" (UID: \"41c8a0b4-594d-4c2d-80e5-8dc6920959a6\") " Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.924963 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-client-ca" (OuterVolumeSpecName: "client-ca") pod "41c8a0b4-594d-4c2d-80e5-8dc6920959a6" (UID: "41c8a0b4-594d-4c2d-80e5-8dc6920959a6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.925047 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-config" (OuterVolumeSpecName: "config") pod "41c8a0b4-594d-4c2d-80e5-8dc6920959a6" (UID: "41c8a0b4-594d-4c2d-80e5-8dc6920959a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.936841 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "41c8a0b4-594d-4c2d-80e5-8dc6920959a6" (UID: "41c8a0b4-594d-4c2d-80e5-8dc6920959a6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:24:06 crc kubenswrapper[4689]: I0307 04:24:06.937971 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-kube-api-access-lhjf4" (OuterVolumeSpecName: "kube-api-access-lhjf4") pod "41c8a0b4-594d-4c2d-80e5-8dc6920959a6" (UID: "41c8a0b4-594d-4c2d-80e5-8dc6920959a6"). InnerVolumeSpecName "kube-api-access-lhjf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.023974 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-config\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.024026 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-client-ca\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.024048 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xm26\" (UniqueName: \"kubernetes.io/projected/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-kube-api-access-5xm26\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.024099 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-serving-cert\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.024152 4689 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.024162 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.024184 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhjf4\" (UniqueName: \"kubernetes.io/projected/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-kube-api-access-lhjf4\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.024193 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c8a0b4-594d-4c2d-80e5-8dc6920959a6-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.049094 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.092942 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" event={"ID":"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97","Type":"ContainerDied","Data":"66630a5fc8b080e711b9d6067e12f9b38b4b7b06387b71633020449c2dd348ea"} Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.092995 4689 scope.go:117] "RemoveContainer" containerID="cc36d7839515456fff41aa54e4dee9e0f91ffcef5480252e04939dc471411049" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.093089 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.100629 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" event={"ID":"41c8a0b4-594d-4c2d-80e5-8dc6920959a6","Type":"ContainerDied","Data":"c1a7bd507fa1f0571d2d84c3aecde20e6fac80b0997b518661dbf99fa8a4ad4d"} Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.100824 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.114964 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.115007 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.123535 4689 scope.go:117] "RemoveContainer" containerID="08dd80ecc252a31f24f6520b27a2080bb6597ec42c26777be1e2f293f618d852" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.127303 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p6rb\" (UniqueName: \"kubernetes.io/projected/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-kube-api-access-4p6rb\") pod \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.127396 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-serving-cert\") pod \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 
04:24:07.127429 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-client-ca\") pod \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.127461 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-config\") pod \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.127483 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-proxy-ca-bundles\") pod \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\" (UID: \"0d82a6b2-6b23-4036-8eb5-fb6b1366bc97\") " Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.127611 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-client-ca\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.127642 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xm26\" (UniqueName: \"kubernetes.io/projected/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-kube-api-access-5xm26\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.127707 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-serving-cert\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.127761 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-config\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.128832 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-config\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.132554 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0d82a6b2-6b23-4036-8eb5-fb6b1366bc97" (UID: "0d82a6b2-6b23-4036-8eb5-fb6b1366bc97"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.132903 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-client-ca\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.132987 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-client-ca" (OuterVolumeSpecName: "client-ca") pod "0d82a6b2-6b23-4036-8eb5-fb6b1366bc97" (UID: "0d82a6b2-6b23-4036-8eb5-fb6b1366bc97"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.133035 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-config" (OuterVolumeSpecName: "config") pod "0d82a6b2-6b23-4036-8eb5-fb6b1366bc97" (UID: "0d82a6b2-6b23-4036-8eb5-fb6b1366bc97"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.133689 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-serving-cert\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.135904 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp"] Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.150501 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-kube-api-access-4p6rb" (OuterVolumeSpecName: "kube-api-access-4p6rb") pod "0d82a6b2-6b23-4036-8eb5-fb6b1366bc97" (UID: "0d82a6b2-6b23-4036-8eb5-fb6b1366bc97"). InnerVolumeSpecName "kube-api-access-4p6rb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.152480 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b6787b7dd-k2ccp"] Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.152516 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xm26\" (UniqueName: \"kubernetes.io/projected/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-kube-api-access-5xm26\") pod \"route-controller-manager-5b5754d5c7-fzdv8\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.167386 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0d82a6b2-6b23-4036-8eb5-fb6b1366bc97" (UID: "0d82a6b2-6b23-4036-8eb5-fb6b1366bc97"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.218999 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.228743 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.228779 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.228789 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.228799 4689 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.228813 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p6rb\" (UniqueName: \"kubernetes.io/projected/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97-kube-api-access-4p6rb\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.416774 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl"] Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.425388 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-76ddcc5b56-qfvbl"] Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.436883 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8"] Mar 07 04:24:07 crc kubenswrapper[4689]: W0307 04:24:07.457373 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e119d7_d5d5_4f2c_b4fa_4f0563498a57.slice/crio-93cf48f2e1c05fca24d8f94dc57c3ac6eedb6a88b0db5e12dc72b8bc5c51d93c WatchSource:0}: Error finding container 93cf48f2e1c05fca24d8f94dc57c3ac6eedb6a88b0db5e12dc72b8bc5c51d93c: Status 404 returned error can't find the container with id 93cf48f2e1c05fca24d8f94dc57c3ac6eedb6a88b0db5e12dc72b8bc5c51d93c Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.658804 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-fmghp" podUID="c82c3040-48ed-473b-9386-d58d13364f29" containerName="registry-server" probeResult="failure" output=< Mar 07 04:24:07 crc kubenswrapper[4689]: timeout: failed to connect service ":50051" within 1s Mar 07 04:24:07 crc kubenswrapper[4689]: > Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.686658 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hvrwc" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" containerName="registry-server" probeResult="failure" output=< Mar 07 04:24:07 crc kubenswrapper[4689]: timeout: failed to connect service ":50051" within 1s Mar 07 04:24:07 crc kubenswrapper[4689]: > Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.853692 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d82a6b2-6b23-4036-8eb5-fb6b1366bc97" path="/var/lib/kubelet/pods/0d82a6b2-6b23-4036-8eb5-fb6b1366bc97/volumes" Mar 07 04:24:07 crc kubenswrapper[4689]: I0307 04:24:07.855413 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c8a0b4-594d-4c2d-80e5-8dc6920959a6" path="/var/lib/kubelet/pods/41c8a0b4-594d-4c2d-80e5-8dc6920959a6/volumes" Mar 07 04:24:08 crc 
kubenswrapper[4689]: I0307 04:24:08.003503 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-chw2s" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" containerName="registry-server" probeResult="failure" output=< Mar 07 04:24:08 crc kubenswrapper[4689]: timeout: failed to connect service ":50051" within 1s Mar 07 04:24:08 crc kubenswrapper[4689]: > Mar 07 04:24:08 crc kubenswrapper[4689]: I0307 04:24:08.110679 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" event={"ID":"07e119d7-d5d5-4f2c-b4fa-4f0563498a57","Type":"ContainerStarted","Data":"074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929"} Mar 07 04:24:08 crc kubenswrapper[4689]: I0307 04:24:08.110744 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" event={"ID":"07e119d7-d5d5-4f2c-b4fa-4f0563498a57","Type":"ContainerStarted","Data":"93cf48f2e1c05fca24d8f94dc57c3ac6eedb6a88b0db5e12dc72b8bc5c51d93c"} Mar 07 04:24:08 crc kubenswrapper[4689]: I0307 04:24:08.111059 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:08 crc kubenswrapper[4689]: I0307 04:24:08.132118 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" podStartSLOduration=3.132094794 podStartE2EDuration="3.132094794s" podCreationTimestamp="2026-03-07 04:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:24:08.129104253 +0000 UTC m=+293.175487742" watchObservedRunningTime="2026-03-07 04:24:08.132094794 +0000 UTC m=+293.178478293" Mar 07 04:24:08 crc kubenswrapper[4689]: I0307 04:24:08.179009 4689 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-gc2hb" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerName="registry-server" probeResult="failure" output=< Mar 07 04:24:08 crc kubenswrapper[4689]: timeout: failed to connect service ":50051" within 1s Mar 07 04:24:08 crc kubenswrapper[4689]: > Mar 07 04:24:08 crc kubenswrapper[4689]: I0307 04:24:08.210326 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:08 crc kubenswrapper[4689]: I0307 04:24:08.549631 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:24:08 crc kubenswrapper[4689]: I0307 04:24:08.549691 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:24:08 crc kubenswrapper[4689]: I0307 04:24:08.673052 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.016920 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.017398 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.091410 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.186530 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.188558 4689 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.299816 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-866fb5bc59-g9s9z"] Mar 07 04:24:09 crc kubenswrapper[4689]: E0307 04:24:09.300725 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d82a6b2-6b23-4036-8eb5-fb6b1366bc97" containerName="controller-manager" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.300751 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d82a6b2-6b23-4036-8eb5-fb6b1366bc97" containerName="controller-manager" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.300878 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d82a6b2-6b23-4036-8eb5-fb6b1366bc97" containerName="controller-manager" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.301680 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.307993 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.308014 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.311531 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.312159 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.312260 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.312509 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.315565 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-866fb5bc59-g9s9z"] Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.318686 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.357491 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwr4f\" (UniqueName: \"kubernetes.io/projected/d730698e-8fcf-4823-91a4-580e9fbbf6bd-kube-api-access-vwr4f\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.357552 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d730698e-8fcf-4823-91a4-580e9fbbf6bd-serving-cert\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.357649 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-client-ca\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.357674 
4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-proxy-ca-bundles\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.357725 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-config\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.458695 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d730698e-8fcf-4823-91a4-580e9fbbf6bd-serving-cert\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.458764 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-client-ca\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.458783 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-proxy-ca-bundles\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " 
pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.458822 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-config\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.458857 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwr4f\" (UniqueName: \"kubernetes.io/projected/d730698e-8fcf-4823-91a4-580e9fbbf6bd-kube-api-access-vwr4f\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.460946 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-client-ca\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.461108 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-proxy-ca-bundles\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.461652 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-config\") pod 
\"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.467243 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d730698e-8fcf-4823-91a4-580e9fbbf6bd-serving-cert\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.478906 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwr4f\" (UniqueName: \"kubernetes.io/projected/d730698e-8fcf-4823-91a4-580e9fbbf6bd-kube-api-access-vwr4f\") pod \"controller-manager-866fb5bc59-g9s9z\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.618327 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.633696 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.633748 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:24:09 crc kubenswrapper[4689]: I0307 04:24:09.860080 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-866fb5bc59-g9s9z"] Mar 07 04:24:10 crc kubenswrapper[4689]: I0307 04:24:10.055736 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:24:10 crc kubenswrapper[4689]: I0307 04:24:10.056258 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:24:10 crc kubenswrapper[4689]: I0307 04:24:10.130222 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" event={"ID":"d730698e-8fcf-4823-91a4-580e9fbbf6bd","Type":"ContainerStarted","Data":"9b7e2f17f7bc0fec354cd95fe42ad78374c768e0f70a2f0310d04ff319688978"} Mar 07 04:24:10 crc kubenswrapper[4689]: I0307 04:24:10.675865 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2wh2s" podUID="98a53e64-9323-454c-9de0-a8d348182a64" containerName="registry-server" probeResult="failure" output=< Mar 07 04:24:10 crc kubenswrapper[4689]: timeout: failed to connect service ":50051" within 1s Mar 07 04:24:10 crc kubenswrapper[4689]: > Mar 07 04:24:11 crc kubenswrapper[4689]: I0307 04:24:11.101392 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5fhcx" 
podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerName="registry-server" probeResult="failure" output=< Mar 07 04:24:11 crc kubenswrapper[4689]: timeout: failed to connect service ":50051" within 1s Mar 07 04:24:11 crc kubenswrapper[4689]: > Mar 07 04:24:11 crc kubenswrapper[4689]: I0307 04:24:11.138669 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" event={"ID":"d730698e-8fcf-4823-91a4-580e9fbbf6bd","Type":"ContainerStarted","Data":"5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1"} Mar 07 04:24:11 crc kubenswrapper[4689]: I0307 04:24:11.139598 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:11 crc kubenswrapper[4689]: I0307 04:24:11.146899 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:11 crc kubenswrapper[4689]: I0307 04:24:11.171576 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" podStartSLOduration=6.171551779 podStartE2EDuration="6.171551779s" podCreationTimestamp="2026-03-07 04:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:24:11.166976056 +0000 UTC m=+296.213359565" watchObservedRunningTime="2026-03-07 04:24:11.171551779 +0000 UTC m=+296.217935308" Mar 07 04:24:11 crc kubenswrapper[4689]: I0307 04:24:11.277822 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-72r56"] Mar 07 04:24:11 crc kubenswrapper[4689]: I0307 04:24:11.278079 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-72r56" 
podUID="18622abe-0dae-4a1b-83b8-8314bf342ccc" containerName="registry-server" containerID="cri-o://5d3ec338aa5763b8677b48433aecdd0ae0b4cb67ca8727d892f44eb8dfc55a84" gracePeriod=2 Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.152880 4689 generic.go:334] "Generic (PLEG): container finished" podID="18622abe-0dae-4a1b-83b8-8314bf342ccc" containerID="5d3ec338aa5763b8677b48433aecdd0ae0b4cb67ca8727d892f44eb8dfc55a84" exitCode=0 Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.153000 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72r56" event={"ID":"18622abe-0dae-4a1b-83b8-8314bf342ccc","Type":"ContainerDied","Data":"5d3ec338aa5763b8677b48433aecdd0ae0b4cb67ca8727d892f44eb8dfc55a84"} Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.472974 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.547874 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp2xw\" (UniqueName: \"kubernetes.io/projected/18622abe-0dae-4a1b-83b8-8314bf342ccc-kube-api-access-xp2xw\") pod \"18622abe-0dae-4a1b-83b8-8314bf342ccc\" (UID: \"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.548020 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-utilities\") pod \"18622abe-0dae-4a1b-83b8-8314bf342ccc\" (UID: \"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.548038 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-catalog-content\") pod \"18622abe-0dae-4a1b-83b8-8314bf342ccc\" (UID: 
\"18622abe-0dae-4a1b-83b8-8314bf342ccc\") " Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.549088 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-utilities" (OuterVolumeSpecName: "utilities") pod "18622abe-0dae-4a1b-83b8-8314bf342ccc" (UID: "18622abe-0dae-4a1b-83b8-8314bf342ccc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.557368 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18622abe-0dae-4a1b-83b8-8314bf342ccc-kube-api-access-xp2xw" (OuterVolumeSpecName: "kube-api-access-xp2xw") pod "18622abe-0dae-4a1b-83b8-8314bf342ccc" (UID: "18622abe-0dae-4a1b-83b8-8314bf342ccc"). InnerVolumeSpecName "kube-api-access-xp2xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.584959 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18622abe-0dae-4a1b-83b8-8314bf342ccc" (UID: "18622abe-0dae-4a1b-83b8-8314bf342ccc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.649454 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp2xw\" (UniqueName: \"kubernetes.io/projected/18622abe-0dae-4a1b-83b8-8314bf342ccc-kube-api-access-xp2xw\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.649506 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:12 crc kubenswrapper[4689]: I0307 04:24:12.649519 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18622abe-0dae-4a1b-83b8-8314bf342ccc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:13 crc kubenswrapper[4689]: I0307 04:24:13.161220 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72r56" Mar 07 04:24:13 crc kubenswrapper[4689]: I0307 04:24:13.161229 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72r56" event={"ID":"18622abe-0dae-4a1b-83b8-8314bf342ccc","Type":"ContainerDied","Data":"bc50de807dd4ed87fc796639e8e02d6a9441f37dc221c658fdef9c162750ed3d"} Mar 07 04:24:13 crc kubenswrapper[4689]: I0307 04:24:13.161375 4689 scope.go:117] "RemoveContainer" containerID="5d3ec338aa5763b8677b48433aecdd0ae0b4cb67ca8727d892f44eb8dfc55a84" Mar 07 04:24:13 crc kubenswrapper[4689]: I0307 04:24:13.192862 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-72r56"] Mar 07 04:24:13 crc kubenswrapper[4689]: I0307 04:24:13.202776 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-72r56"] Mar 07 04:24:13 crc kubenswrapper[4689]: I0307 04:24:13.207045 4689 scope.go:117] 
"RemoveContainer" containerID="7ab433c9d3b663b61fde835084897dfe5badd525c40323e2c79e8aca26a854f7" Mar 07 04:24:13 crc kubenswrapper[4689]: I0307 04:24:13.232298 4689 scope.go:117] "RemoveContainer" containerID="ee408d1b8417cd40313871a28ad8ef262987426ebc665f9b7bc4ebbe63cdd2af" Mar 07 04:24:13 crc kubenswrapper[4689]: I0307 04:24:13.853706 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18622abe-0dae-4a1b-83b8-8314bf342ccc" path="/var/lib/kubelet/pods/18622abe-0dae-4a1b-83b8-8314bf342ccc/volumes" Mar 07 04:24:16 crc kubenswrapper[4689]: I0307 04:24:16.489688 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:24:16 crc kubenswrapper[4689]: I0307 04:24:16.561879 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:24:16 crc kubenswrapper[4689]: I0307 04:24:16.713573 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:24:16 crc kubenswrapper[4689]: I0307 04:24:16.784433 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:24:16 crc kubenswrapper[4689]: I0307 04:24:16.985459 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:24:17 crc kubenswrapper[4689]: I0307 04:24:17.034807 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:24:17 crc kubenswrapper[4689]: I0307 04:24:17.166155 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:24:17 crc kubenswrapper[4689]: I0307 04:24:17.238887 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:24:17 crc kubenswrapper[4689]: I0307 04:24:17.679190 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-chw2s"] Mar 07 04:24:18 crc kubenswrapper[4689]: I0307 04:24:18.199627 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-chw2s" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" containerName="registry-server" containerID="cri-o://ba5f0a13bfa0baabfe33d32c22a2462159f366c9bf71c75679855771d7da1d48" gracePeriod=2 Mar 07 04:24:19 crc kubenswrapper[4689]: I0307 04:24:19.085966 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gc2hb"] Mar 07 04:24:19 crc kubenswrapper[4689]: I0307 04:24:19.086944 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gc2hb" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerName="registry-server" containerID="cri-o://891e3d2d19f1f2030d54aeeb15c5472652e84fd8d62eb840acd78c4b67b9f36f" gracePeriod=2 Mar 07 04:24:19 crc kubenswrapper[4689]: I0307 04:24:19.207828 4689 generic.go:334] "Generic (PLEG): container finished" podID="99bbfad4-6baf-4ada-88b8-158f49957da5" containerID="ba5f0a13bfa0baabfe33d32c22a2462159f366c9bf71c75679855771d7da1d48" exitCode=0 Mar 07 04:24:19 crc kubenswrapper[4689]: I0307 04:24:19.207909 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chw2s" event={"ID":"99bbfad4-6baf-4ada-88b8-158f49957da5","Type":"ContainerDied","Data":"ba5f0a13bfa0baabfe33d32c22a2462159f366c9bf71c75679855771d7da1d48"} Mar 07 04:24:19 crc kubenswrapper[4689]: I0307 04:24:19.699896 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:24:19 crc kubenswrapper[4689]: I0307 04:24:19.756539 4689 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.025501 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.111011 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.155256 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.173995 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfn55\" (UniqueName: \"kubernetes.io/projected/99bbfad4-6baf-4ada-88b8-158f49957da5-kube-api-access-tfn55\") pod \"99bbfad4-6baf-4ada-88b8-158f49957da5\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.174054 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-utilities\") pod \"99bbfad4-6baf-4ada-88b8-158f49957da5\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.174162 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-catalog-content\") pod \"99bbfad4-6baf-4ada-88b8-158f49957da5\" (UID: \"99bbfad4-6baf-4ada-88b8-158f49957da5\") " Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.175187 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-utilities" (OuterVolumeSpecName: "utilities") pod 
"99bbfad4-6baf-4ada-88b8-158f49957da5" (UID: "99bbfad4-6baf-4ada-88b8-158f49957da5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.180396 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bbfad4-6baf-4ada-88b8-158f49957da5-kube-api-access-tfn55" (OuterVolumeSpecName: "kube-api-access-tfn55") pod "99bbfad4-6baf-4ada-88b8-158f49957da5" (UID: "99bbfad4-6baf-4ada-88b8-158f49957da5"). InnerVolumeSpecName "kube-api-access-tfn55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.216762 4689 generic.go:334] "Generic (PLEG): container finished" podID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerID="891e3d2d19f1f2030d54aeeb15c5472652e84fd8d62eb840acd78c4b67b9f36f" exitCode=0 Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.216852 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gc2hb" event={"ID":"d4a365d2-d74f-4675-b789-27bafa93fbff","Type":"ContainerDied","Data":"891e3d2d19f1f2030d54aeeb15c5472652e84fd8d62eb840acd78c4b67b9f36f"} Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.220823 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chw2s" event={"ID":"99bbfad4-6baf-4ada-88b8-158f49957da5","Type":"ContainerDied","Data":"3538b87fae878bdd5727fcccb1b1f79c960f83b3fcd0fdf7acd97bbe8402b3ef"} Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.220928 4689 scope.go:117] "RemoveContainer" containerID="ba5f0a13bfa0baabfe33d32c22a2462159f366c9bf71c75679855771d7da1d48" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.221089 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-chw2s" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.227557 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99bbfad4-6baf-4ada-88b8-158f49957da5" (UID: "99bbfad4-6baf-4ada-88b8-158f49957da5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.237403 4689 scope.go:117] "RemoveContainer" containerID="2119b807ba910329fe2bcb1c84fc119e4e511acab92fd0b5a963ba6a463b0e0e" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.251139 4689 scope.go:117] "RemoveContainer" containerID="f6adef77b539dc706907b15d2309dc087fd0c7ce7b432e9f28b580d81736de27" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.276248 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.276277 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfn55\" (UniqueName: \"kubernetes.io/projected/99bbfad4-6baf-4ada-88b8-158f49957da5-kube-api-access-tfn55\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.276289 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99bbfad4-6baf-4ada-88b8-158f49957da5-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.549872 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-chw2s"] Mar 07 04:24:20 crc kubenswrapper[4689]: I0307 04:24:20.552363 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-chw2s"] Mar 07 04:24:21 crc kubenswrapper[4689]: I0307 04:24:21.469542 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:24:21 crc kubenswrapper[4689]: I0307 04:24:21.593673 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-utilities\") pod \"d4a365d2-d74f-4675-b789-27bafa93fbff\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " Mar 07 04:24:21 crc kubenswrapper[4689]: I0307 04:24:21.593724 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-catalog-content\") pod \"d4a365d2-d74f-4675-b789-27bafa93fbff\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " Mar 07 04:24:21 crc kubenswrapper[4689]: I0307 04:24:21.593785 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxh66\" (UniqueName: \"kubernetes.io/projected/d4a365d2-d74f-4675-b789-27bafa93fbff-kube-api-access-kxh66\") pod \"d4a365d2-d74f-4675-b789-27bafa93fbff\" (UID: \"d4a365d2-d74f-4675-b789-27bafa93fbff\") " Mar 07 04:24:21 crc kubenswrapper[4689]: I0307 04:24:21.594897 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-utilities" (OuterVolumeSpecName: "utilities") pod "d4a365d2-d74f-4675-b789-27bafa93fbff" (UID: "d4a365d2-d74f-4675-b789-27bafa93fbff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:24:21 crc kubenswrapper[4689]: I0307 04:24:21.599973 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a365d2-d74f-4675-b789-27bafa93fbff-kube-api-access-kxh66" (OuterVolumeSpecName: "kube-api-access-kxh66") pod "d4a365d2-d74f-4675-b789-27bafa93fbff" (UID: "d4a365d2-d74f-4675-b789-27bafa93fbff"). InnerVolumeSpecName "kube-api-access-kxh66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:24:21 crc kubenswrapper[4689]: I0307 04:24:21.651966 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4a365d2-d74f-4675-b789-27bafa93fbff" (UID: "d4a365d2-d74f-4675-b789-27bafa93fbff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:24:21 crc kubenswrapper[4689]: I0307 04:24:21.695019 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:21 crc kubenswrapper[4689]: I0307 04:24:21.695065 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a365d2-d74f-4675-b789-27bafa93fbff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:21 crc kubenswrapper[4689]: I0307 04:24:21.695082 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxh66\" (UniqueName: \"kubernetes.io/projected/d4a365d2-d74f-4675-b789-27bafa93fbff-kube-api-access-kxh66\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:21 crc kubenswrapper[4689]: I0307 04:24:21.832592 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" 
path="/var/lib/kubelet/pods/99bbfad4-6baf-4ada-88b8-158f49957da5/volumes" Mar 07 04:24:22 crc kubenswrapper[4689]: I0307 04:24:22.234291 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gc2hb" event={"ID":"d4a365d2-d74f-4675-b789-27bafa93fbff","Type":"ContainerDied","Data":"1e80c58b90f87062395443f36ee703eab04f03f064addb6586c8934f7e8c6957"} Mar 07 04:24:22 crc kubenswrapper[4689]: I0307 04:24:22.234359 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gc2hb" Mar 07 04:24:22 crc kubenswrapper[4689]: I0307 04:24:22.234392 4689 scope.go:117] "RemoveContainer" containerID="891e3d2d19f1f2030d54aeeb15c5472652e84fd8d62eb840acd78c4b67b9f36f" Mar 07 04:24:22 crc kubenswrapper[4689]: I0307 04:24:22.251849 4689 scope.go:117] "RemoveContainer" containerID="759bcb7508f2585a20d2e984948abd716013456d87e2418a66f10cb6cb385205" Mar 07 04:24:22 crc kubenswrapper[4689]: I0307 04:24:22.252202 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gc2hb"] Mar 07 04:24:22 crc kubenswrapper[4689]: I0307 04:24:22.261281 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gc2hb"] Mar 07 04:24:22 crc kubenswrapper[4689]: I0307 04:24:22.269654 4689 scope.go:117] "RemoveContainer" containerID="4f336cd13b96da657bb9c0b1073a8059dd29fba54ce5443167766adcbf8d3b49" Mar 07 04:24:22 crc kubenswrapper[4689]: I0307 04:24:22.477146 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fhcx"] Mar 07 04:24:22 crc kubenswrapper[4689]: I0307 04:24:22.477387 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5fhcx" podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerName="registry-server" 
containerID="cri-o://0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215" gracePeriod=2 Mar 07 04:24:22 crc kubenswrapper[4689]: I0307 04:24:22.960008 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.112380 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-catalog-content\") pod \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.112430 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-utilities\") pod \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.112477 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kfv4\" (UniqueName: \"kubernetes.io/projected/b84afefb-ca8f-4586-a7bc-6d733cb723b1-kube-api-access-2kfv4\") pod \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\" (UID: \"b84afefb-ca8f-4586-a7bc-6d733cb723b1\") " Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.113672 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-utilities" (OuterVolumeSpecName: "utilities") pod "b84afefb-ca8f-4586-a7bc-6d733cb723b1" (UID: "b84afefb-ca8f-4586-a7bc-6d733cb723b1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.117255 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b84afefb-ca8f-4586-a7bc-6d733cb723b1-kube-api-access-2kfv4" (OuterVolumeSpecName: "kube-api-access-2kfv4") pod "b84afefb-ca8f-4586-a7bc-6d733cb723b1" (UID: "b84afefb-ca8f-4586-a7bc-6d733cb723b1"). InnerVolumeSpecName "kube-api-access-2kfv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.213542 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.213582 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kfv4\" (UniqueName: \"kubernetes.io/projected/b84afefb-ca8f-4586-a7bc-6d733cb723b1-kube-api-access-2kfv4\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.241279 4689 generic.go:334] "Generic (PLEG): container finished" podID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerID="0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215" exitCode=0 Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.241348 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fhcx" event={"ID":"b84afefb-ca8f-4586-a7bc-6d733cb723b1","Type":"ContainerDied","Data":"0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215"} Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.241380 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fhcx" event={"ID":"b84afefb-ca8f-4586-a7bc-6d733cb723b1","Type":"ContainerDied","Data":"2fbcb8a08e1d31015539a5f486d69ebfc2f2531ddbf2cabf4650a5d65534301b"} Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 
04:24:23.241405 4689 scope.go:117] "RemoveContainer" containerID="0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.241514 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fhcx" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.241999 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b84afefb-ca8f-4586-a7bc-6d733cb723b1" (UID: "b84afefb-ca8f-4586-a7bc-6d733cb723b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.261742 4689 scope.go:117] "RemoveContainer" containerID="4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.279539 4689 scope.go:117] "RemoveContainer" containerID="41c587ecc10eca9dd4cbe0eef1bb567d64fee74a46374564312a04497e0f19a7" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.293949 4689 scope.go:117] "RemoveContainer" containerID="0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215" Mar 07 04:24:23 crc kubenswrapper[4689]: E0307 04:24:23.294633 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215\": container with ID starting with 0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215 not found: ID does not exist" containerID="0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.294704 4689 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215"} err="failed to get container status \"0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215\": rpc error: code = NotFound desc = could not find container \"0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215\": container with ID starting with 0aa9cf73419786c4d42a0c3d600359ec8ec6d350b15c363736aea9ef36a33215 not found: ID does not exist" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.294736 4689 scope.go:117] "RemoveContainer" containerID="4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e" Mar 07 04:24:23 crc kubenswrapper[4689]: E0307 04:24:23.295053 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e\": container with ID starting with 4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e not found: ID does not exist" containerID="4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.295105 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e"} err="failed to get container status \"4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e\": rpc error: code = NotFound desc = could not find container \"4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e\": container with ID starting with 4be73f5f6dcfed4bf698cb70894ba2d78eb52e81facd27c5142834df5978631e not found: ID does not exist" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.295125 4689 scope.go:117] "RemoveContainer" containerID="41c587ecc10eca9dd4cbe0eef1bb567d64fee74a46374564312a04497e0f19a7" Mar 07 04:24:23 crc kubenswrapper[4689]: E0307 04:24:23.295608 4689 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"41c587ecc10eca9dd4cbe0eef1bb567d64fee74a46374564312a04497e0f19a7\": container with ID starting with 41c587ecc10eca9dd4cbe0eef1bb567d64fee74a46374564312a04497e0f19a7 not found: ID does not exist" containerID="41c587ecc10eca9dd4cbe0eef1bb567d64fee74a46374564312a04497e0f19a7" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.295638 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c587ecc10eca9dd4cbe0eef1bb567d64fee74a46374564312a04497e0f19a7"} err="failed to get container status \"41c587ecc10eca9dd4cbe0eef1bb567d64fee74a46374564312a04497e0f19a7\": rpc error: code = NotFound desc = could not find container \"41c587ecc10eca9dd4cbe0eef1bb567d64fee74a46374564312a04497e0f19a7\": container with ID starting with 41c587ecc10eca9dd4cbe0eef1bb567d64fee74a46374564312a04497e0f19a7 not found: ID does not exist" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.314529 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84afefb-ca8f-4586-a7bc-6d733cb723b1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.566453 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fhcx"] Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.577967 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5fhcx"] Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.834689 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" path="/var/lib/kubelet/pods/b84afefb-ca8f-4586-a7bc-6d733cb723b1/volumes" Mar 07 04:24:23 crc kubenswrapper[4689]: I0307 04:24:23.835431 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" 
path="/var/lib/kubelet/pods/d4a365d2-d74f-4675-b789-27bafa93fbff/volumes" Mar 07 04:24:25 crc kubenswrapper[4689]: I0307 04:24:25.643515 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-866fb5bc59-g9s9z"] Mar 07 04:24:25 crc kubenswrapper[4689]: I0307 04:24:25.643759 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" podUID="d730698e-8fcf-4823-91a4-580e9fbbf6bd" containerName="controller-manager" containerID="cri-o://5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1" gracePeriod=30 Mar 07 04:24:25 crc kubenswrapper[4689]: I0307 04:24:25.732184 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8"] Mar 07 04:24:25 crc kubenswrapper[4689]: I0307 04:24:25.732432 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" podUID="07e119d7-d5d5-4f2c-b4fa-4f0563498a57" containerName="route-controller-manager" containerID="cri-o://074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929" gracePeriod=30 Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.236878 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.241297 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.262205 4689 generic.go:334] "Generic (PLEG): container finished" podID="07e119d7-d5d5-4f2c-b4fa-4f0563498a57" containerID="074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929" exitCode=0 Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.262270 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.262282 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" event={"ID":"07e119d7-d5d5-4f2c-b4fa-4f0563498a57","Type":"ContainerDied","Data":"074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929"} Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.262328 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8" event={"ID":"07e119d7-d5d5-4f2c-b4fa-4f0563498a57","Type":"ContainerDied","Data":"93cf48f2e1c05fca24d8f94dc57c3ac6eedb6a88b0db5e12dc72b8bc5c51d93c"} Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.262348 4689 scope.go:117] "RemoveContainer" containerID="074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.263692 4689 generic.go:334] "Generic (PLEG): container finished" podID="d730698e-8fcf-4823-91a4-580e9fbbf6bd" containerID="5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1" exitCode=0 Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.263719 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" 
event={"ID":"d730698e-8fcf-4823-91a4-580e9fbbf6bd","Type":"ContainerDied","Data":"5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1"} Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.263740 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" event={"ID":"d730698e-8fcf-4823-91a4-580e9fbbf6bd","Type":"ContainerDied","Data":"9b7e2f17f7bc0fec354cd95fe42ad78374c768e0f70a2f0310d04ff319688978"} Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.263744 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866fb5bc59-g9s9z" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.284281 4689 scope.go:117] "RemoveContainer" containerID="074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929" Mar 07 04:24:26 crc kubenswrapper[4689]: E0307 04:24:26.284788 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929\": container with ID starting with 074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929 not found: ID does not exist" containerID="074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.284841 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929"} err="failed to get container status \"074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929\": rpc error: code = NotFound desc = could not find container \"074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929\": container with ID starting with 074f76a804a2dd6e50ac5f66fa99d957d3d026f12ae9be0125eb2d98fda08929 not found: ID does not exist" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 
04:24:26.284876 4689 scope.go:117] "RemoveContainer" containerID="5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.297444 4689 scope.go:117] "RemoveContainer" containerID="5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1" Mar 07 04:24:26 crc kubenswrapper[4689]: E0307 04:24:26.298153 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1\": container with ID starting with 5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1 not found: ID does not exist" containerID="5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.298275 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1"} err="failed to get container status \"5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1\": rpc error: code = NotFound desc = could not find container \"5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1\": container with ID starting with 5cdb514cc1185efdb9a996fe2a64ab79b7c593689b1a771866e97c5fcf138ca1 not found: ID does not exist" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.355476 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-serving-cert\") pod \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.355529 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwr4f\" (UniqueName: \"kubernetes.io/projected/d730698e-8fcf-4823-91a4-580e9fbbf6bd-kube-api-access-vwr4f\") pod 
\"d730698e-8fcf-4823-91a4-580e9fbbf6bd\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.355580 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-proxy-ca-bundles\") pod \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.355605 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d730698e-8fcf-4823-91a4-580e9fbbf6bd-serving-cert\") pod \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.355650 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xm26\" (UniqueName: \"kubernetes.io/projected/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-kube-api-access-5xm26\") pod \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.355691 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-config\") pod \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.355724 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-client-ca\") pod \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\" (UID: \"07e119d7-d5d5-4f2c-b4fa-4f0563498a57\") " Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.355744 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-client-ca\") pod \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.355791 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-config\") pod \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\" (UID: \"d730698e-8fcf-4823-91a4-580e9fbbf6bd\") " Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.356528 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-client-ca" (OuterVolumeSpecName: "client-ca") pod "07e119d7-d5d5-4f2c-b4fa-4f0563498a57" (UID: "07e119d7-d5d5-4f2c-b4fa-4f0563498a57"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.356941 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-client-ca" (OuterVolumeSpecName: "client-ca") pod "d730698e-8fcf-4823-91a4-580e9fbbf6bd" (UID: "d730698e-8fcf-4823-91a4-580e9fbbf6bd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.356983 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-config" (OuterVolumeSpecName: "config") pod "07e119d7-d5d5-4f2c-b4fa-4f0563498a57" (UID: "07e119d7-d5d5-4f2c-b4fa-4f0563498a57"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.357327 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-config" (OuterVolumeSpecName: "config") pod "d730698e-8fcf-4823-91a4-580e9fbbf6bd" (UID: "d730698e-8fcf-4823-91a4-580e9fbbf6bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.357339 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d730698e-8fcf-4823-91a4-580e9fbbf6bd" (UID: "d730698e-8fcf-4823-91a4-580e9fbbf6bd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.360851 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d730698e-8fcf-4823-91a4-580e9fbbf6bd-kube-api-access-vwr4f" (OuterVolumeSpecName: "kube-api-access-vwr4f") pod "d730698e-8fcf-4823-91a4-580e9fbbf6bd" (UID: "d730698e-8fcf-4823-91a4-580e9fbbf6bd"). InnerVolumeSpecName "kube-api-access-vwr4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.361035 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d730698e-8fcf-4823-91a4-580e9fbbf6bd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d730698e-8fcf-4823-91a4-580e9fbbf6bd" (UID: "d730698e-8fcf-4823-91a4-580e9fbbf6bd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.362231 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "07e119d7-d5d5-4f2c-b4fa-4f0563498a57" (UID: "07e119d7-d5d5-4f2c-b4fa-4f0563498a57"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.362458 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-kube-api-access-5xm26" (OuterVolumeSpecName: "kube-api-access-5xm26") pod "07e119d7-d5d5-4f2c-b4fa-4f0563498a57" (UID: "07e119d7-d5d5-4f2c-b4fa-4f0563498a57"). InnerVolumeSpecName "kube-api-access-5xm26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.456925 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.456955 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.456967 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwr4f\" (UniqueName: \"kubernetes.io/projected/d730698e-8fcf-4823-91a4-580e9fbbf6bd-kube-api-access-vwr4f\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.456978 4689 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.456986 4689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d730698e-8fcf-4823-91a4-580e9fbbf6bd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.456995 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xm26\" (UniqueName: \"kubernetes.io/projected/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-kube-api-access-5xm26\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.457025 4689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.457033 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07e119d7-d5d5-4f2c-b4fa-4f0563498a57-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.457041 4689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d730698e-8fcf-4823-91a4-580e9fbbf6bd-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.587949 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8"] Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.596672 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5754d5c7-fzdv8"] Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.606683 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-866fb5bc59-g9s9z"] Mar 07 04:24:26 crc kubenswrapper[4689]: I0307 04:24:26.610321 4689 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-866fb5bc59-g9s9z"] Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.202126 4689 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.202462 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136" gracePeriod=15 Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.202495 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10" gracePeriod=15 Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.202615 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643" gracePeriod=15 Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.202613 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27" gracePeriod=15 Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.202599 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555" gracePeriod=15 Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204349 4689 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204663 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerName="extract-content" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204677 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerName="extract-content" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204692 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerName="extract-utilities" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204700 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerName="extract-utilities" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204711 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerName="registry-server" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204719 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerName="registry-server" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204729 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerName="registry-server" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204737 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerName="registry-server" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 
04:24:27.204748 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerName="extract-content" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204755 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerName="extract-content" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204763 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204770 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204778 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerName="extract-utilities" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204785 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerName="extract-utilities" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204794 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204801 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204809 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18622abe-0dae-4a1b-83b8-8314bf342ccc" containerName="registry-server" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204815 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="18622abe-0dae-4a1b-83b8-8314bf342ccc" containerName="registry-server" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204827 
4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e119d7-d5d5-4f2c-b4fa-4f0563498a57" containerName="route-controller-manager" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204834 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e119d7-d5d5-4f2c-b4fa-4f0563498a57" containerName="route-controller-manager" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204844 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" containerName="extract-content" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204852 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" containerName="extract-content" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204866 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" containerName="extract-utilities" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204873 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" containerName="extract-utilities" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204883 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204890 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204903 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18622abe-0dae-4a1b-83b8-8314bf342ccc" containerName="extract-utilities" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204910 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="18622abe-0dae-4a1b-83b8-8314bf342ccc" containerName="extract-utilities" Mar 07 04:24:27 crc 
kubenswrapper[4689]: E0307 04:24:27.204922 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204930 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204939 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204946 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204956 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204963 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204972 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18622abe-0dae-4a1b-83b8-8314bf342ccc" containerName="extract-content" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204979 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="18622abe-0dae-4a1b-83b8-8314bf342ccc" containerName="extract-content" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.204991 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" containerName="registry-server" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.204998 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" 
containerName="registry-server" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.205009 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205017 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.205027 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205034 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.205043 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205050 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.205058 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d730698e-8fcf-4823-91a4-580e9fbbf6bd" containerName="controller-manager" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205066 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d730698e-8fcf-4823-91a4-580e9fbbf6bd" containerName="controller-manager" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205182 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bbfad4-6baf-4ada-88b8-158f49957da5" containerName="registry-server" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205196 4689 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205205 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e119d7-d5d5-4f2c-b4fa-4f0563498a57" containerName="route-controller-manager" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205214 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205221 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205229 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d730698e-8fcf-4823-91a4-580e9fbbf6bd" containerName="controller-manager" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205239 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205246 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205256 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a365d2-d74f-4675-b789-27bafa93fbff" containerName="registry-server" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205265 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205275 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="18622abe-0dae-4a1b-83b8-8314bf342ccc" containerName="registry-server" Mar 
07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205286 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205297 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="b84afefb-ca8f-4586-a7bc-6d733cb723b1" containerName="registry-server" Mar 07 04:24:27 crc kubenswrapper[4689]: E0307 04:24:27.205416 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205426 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205533 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.205552 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.213948 4689 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.215639 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.221952 4689 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.366670 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.366743 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.366874 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.366929 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 
04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.366981 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.367052 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.367107 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.367273 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.468936 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469047 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469072 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469112 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469150 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469197 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469194 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469239 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469250 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469210 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469241 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 
04:24:27.469256 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469206 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469226 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469386 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.469537 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.841251 4689 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="07e119d7-d5d5-4f2c-b4fa-4f0563498a57" path="/var/lib/kubelet/pods/07e119d7-d5d5-4f2c-b4fa-4f0563498a57/volumes" Mar 07 04:24:27 crc kubenswrapper[4689]: I0307 04:24:27.844586 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d730698e-8fcf-4823-91a4-580e9fbbf6bd" path="/var/lib/kubelet/pods/d730698e-8fcf-4823-91a4-580e9fbbf6bd/volumes" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.142375 4689 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.142836 4689 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.143403 4689 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.143948 4689 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.144589 4689 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:28 crc kubenswrapper[4689]: I0307 04:24:28.144683 4689 controller.go:115] "failed to update lease using latest lease, 
fallback to ensure lease" err="failed 5 attempts to update lease" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.145278 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms" Mar 07 04:24:28 crc kubenswrapper[4689]: I0307 04:24:28.289349 4689 generic.go:334] "Generic (PLEG): container finished" podID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" containerID="ed21409e18f5a88176a2be6c9001166797e80fd96201392fda2a091521f633fa" exitCode=0 Mar 07 04:24:28 crc kubenswrapper[4689]: I0307 04:24:28.289375 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6","Type":"ContainerDied","Data":"ed21409e18f5a88176a2be6c9001166797e80fd96201392fda2a091521f633fa"} Mar 07 04:24:28 crc kubenswrapper[4689]: I0307 04:24:28.290534 4689 status_manager.go:851] "Failed to get status for pod" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:28 crc kubenswrapper[4689]: I0307 04:24:28.292242 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 07 04:24:28 crc kubenswrapper[4689]: I0307 04:24:28.293731 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 04:24:28 crc kubenswrapper[4689]: I0307 04:24:28.294559 4689 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136" exitCode=0 Mar 07 04:24:28 crc kubenswrapper[4689]: I0307 04:24:28.294597 4689 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555" exitCode=0 Mar 07 04:24:28 crc kubenswrapper[4689]: I0307 04:24:28.294618 4689 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10" exitCode=0 Mar 07 04:24:28 crc kubenswrapper[4689]: I0307 04:24:28.294632 4689 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27" exitCode=2 Mar 07 04:24:28 crc kubenswrapper[4689]: I0307 04:24:28.294700 4689 scope.go:117] "RemoveContainer" containerID="504e9da03d2dce361f7791b4ed981ad15f7da6905b21d7776e385d4586fd2301" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.346268 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.346208 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:24:28Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:24:28Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:24:28Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:24:28Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:3a51a9700f70a0f3441e6c87e61adc96221ee634d131686cfbdda6a0ab029e66\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:90928686b004358b08fd7393a22c7c20a8c5fd972624fd815b85cbbf10ee98a6\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220120275},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d
8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.347255 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.348087 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.348694 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.349216 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.349262 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 
04:24:28 crc kubenswrapper[4689]: E0307 04:24:28.747535 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.313282 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 04:24:29 crc kubenswrapper[4689]: E0307 04:24:29.555690 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.697467 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.697968 4689 status_manager.go:851] "Failed to get status for pod" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.698585 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.699114 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.699474 4689 status_manager.go:851] "Failed to get status for pod" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.699815 4689 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.815154 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kubelet-dir\") pod \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.815343 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.815374 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" (UID: "40d78e2e-6dbe-47ff-9db0-79bd0057c7d6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.815438 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kube-api-access\") pod \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.815463 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.815682 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.815822 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-var-lock\") pod \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\" (UID: \"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6\") " Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.815870 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.815911 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-var-lock" (OuterVolumeSpecName: "var-lock") pod "40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" (UID: "40d78e2e-6dbe-47ff-9db0-79bd0057c7d6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.816069 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.816241 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.816654 4689 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.816746 4689 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-var-lock\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.816769 4689 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.816788 4689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.816812 4689 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.824429 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" (UID: "40d78e2e-6dbe-47ff-9db0-79bd0057c7d6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.839547 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 07 04:24:29 crc kubenswrapper[4689]: I0307 04:24:29.917848 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40d78e2e-6dbe-47ff-9db0-79bd0057c7d6-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.336506 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.338259 4689 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643" exitCode=0 Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.338365 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.338421 4689 scope.go:117] "RemoveContainer" containerID="7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.339395 4689 status_manager.go:851] "Failed to get status for pod" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.340441 4689 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.341681 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40d78e2e-6dbe-47ff-9db0-79bd0057c7d6","Type":"ContainerDied","Data":"c42538de8beef5bb2a23bcc351e457ff150af13c057f04d6e3994eb608c90374"} Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.341769 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.341793 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c42538de8beef5bb2a23bcc351e457ff150af13c057f04d6e3994eb608c90374" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.343852 4689 status_manager.go:851] "Failed to get status for pod" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.344439 4689 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.351824 4689 status_manager.go:851] "Failed to get status for pod" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.352368 4689 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.361996 4689 scope.go:117] "RemoveContainer" 
containerID="1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.389225 4689 scope.go:117] "RemoveContainer" containerID="3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.409366 4689 scope.go:117] "RemoveContainer" containerID="789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.428678 4689 scope.go:117] "RemoveContainer" containerID="921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.449419 4689 scope.go:117] "RemoveContainer" containerID="07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.516867 4689 scope.go:117] "RemoveContainer" containerID="7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136" Mar 07 04:24:30 crc kubenswrapper[4689]: E0307 04:24:30.517821 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\": container with ID starting with 7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136 not found: ID does not exist" containerID="7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.517906 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136"} err="failed to get container status \"7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\": rpc error: code = NotFound desc = could not find container \"7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136\": container with ID starting with 
7754d23e308beab8ec59e82eee919d0efd721f029c4b2804b21c84d771756136 not found: ID does not exist" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.517945 4689 scope.go:117] "RemoveContainer" containerID="1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555" Mar 07 04:24:30 crc kubenswrapper[4689]: E0307 04:24:30.518387 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\": container with ID starting with 1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555 not found: ID does not exist" containerID="1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.518457 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555"} err="failed to get container status \"1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\": rpc error: code = NotFound desc = could not find container \"1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555\": container with ID starting with 1a36d8dd9d855634b850be401bea0e170c3ef90e92c355380ebdccc74862c555 not found: ID does not exist" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.518507 4689 scope.go:117] "RemoveContainer" containerID="3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10" Mar 07 04:24:30 crc kubenswrapper[4689]: E0307 04:24:30.518931 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\": container with ID starting with 3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10 not found: ID does not exist" containerID="3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10" Mar 07 04:24:30 crc 
kubenswrapper[4689]: I0307 04:24:30.518967 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10"} err="failed to get container status \"3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\": rpc error: code = NotFound desc = could not find container \"3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10\": container with ID starting with 3ac109530bc3a37253325cdd13e2d3b29c431bd0143c46897e46d6bfffed1b10 not found: ID does not exist" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.519027 4689 scope.go:117] "RemoveContainer" containerID="789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27" Mar 07 04:24:30 crc kubenswrapper[4689]: E0307 04:24:30.519388 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\": container with ID starting with 789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27 not found: ID does not exist" containerID="789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.519511 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27"} err="failed to get container status \"789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\": rpc error: code = NotFound desc = could not find container \"789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27\": container with ID starting with 789560341b34a39af312e6519c631c0508184334b3fc5332acce86b7cb901c27 not found: ID does not exist" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.519597 4689 scope.go:117] "RemoveContainer" containerID="921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643" Mar 07 
04:24:30 crc kubenswrapper[4689]: E0307 04:24:30.519908 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\": container with ID starting with 921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643 not found: ID does not exist" containerID="921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.519989 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643"} err="failed to get container status \"921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\": rpc error: code = NotFound desc = could not find container \"921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643\": container with ID starting with 921e3a5e46b2181530be6acce53ea28aa8d10001824b76684c0737df3ecbf643 not found: ID does not exist" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.520057 4689 scope.go:117] "RemoveContainer" containerID="07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6" Mar 07 04:24:30 crc kubenswrapper[4689]: E0307 04:24:30.520930 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\": container with ID starting with 07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6 not found: ID does not exist" containerID="07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6" Mar 07 04:24:30 crc kubenswrapper[4689]: I0307 04:24:30.520961 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6"} err="failed to get container status 
\"07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\": rpc error: code = NotFound desc = could not find container \"07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6\": container with ID starting with 07f4660a0bd2699390fb6a56c2b05aaf6d30c292dc03ab38ceab531425cba8e6 not found: ID does not exist" Mar 07 04:24:31 crc kubenswrapper[4689]: E0307 04:24:31.158257 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Mar 07 04:24:32 crc kubenswrapper[4689]: E0307 04:24:32.260886 4689 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:32 crc kubenswrapper[4689]: I0307 04:24:32.261368 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:32 crc kubenswrapper[4689]: W0307 04:24:32.301119 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6ccacaf10b2db4caa17710a9f1219c7a84444924ec3b4aea024fa2e2b619529c WatchSource:0}: Error finding container 6ccacaf10b2db4caa17710a9f1219c7a84444924ec3b4aea024fa2e2b619529c: Status 404 returned error can't find the container with id 6ccacaf10b2db4caa17710a9f1219c7a84444924ec3b4aea024fa2e2b619529c Mar 07 04:24:32 crc kubenswrapper[4689]: E0307 04:24:32.307469 4689 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189a747a62370d85 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:24:32.306318725 +0000 UTC m=+317.352702224,LastTimestamp:2026-03-07 04:24:32.306318725 +0000 UTC m=+317.352702224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:24:32 crc kubenswrapper[4689]: I0307 04:24:32.369146 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6ccacaf10b2db4caa17710a9f1219c7a84444924ec3b4aea024fa2e2b619529c"} Mar 07 04:24:33 crc kubenswrapper[4689]: I0307 04:24:33.375013 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b209f86d0eb63aa69ca20da8da46cac738229b9dbb0c6e6e9671607dab93d1f7"} Mar 07 04:24:33 crc kubenswrapper[4689]: I0307 04:24:33.375685 4689 status_manager.go:851] "Failed to get status for pod" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:33 crc kubenswrapper[4689]: E0307 04:24:33.375717 4689 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:34 crc kubenswrapper[4689]: E0307 04:24:34.359451 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="6.4s" Mar 07 04:24:34 crc kubenswrapper[4689]: E0307 04:24:34.382327 4689 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:24:35 crc kubenswrapper[4689]: I0307 
04:24:35.831109 4689 status_manager.go:851] "Failed to get status for pod" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:37 crc kubenswrapper[4689]: E0307 04:24:37.923434 4689 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189a747a62370d85 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 04:24:32.306318725 +0000 UTC m=+317.352702224,LastTimestamp:2026-03-07 04:24:32.306318725 +0000 UTC m=+317.352702224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 04:24:38 crc kubenswrapper[4689]: E0307 04:24:38.619072 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:24:38Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:24:38Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:24:38Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T04:24:38Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:3a51a9700f70a0f3441e6c87e61adc96221ee634d131686cfbdda6a0ab029e66\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:90928686b004358b08fd7393a22c7c20a8c5fd972624fd815b85cbbf10ee98a6\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220120275},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d
8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:38 crc kubenswrapper[4689]: E0307 04:24:38.619616 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:38 crc kubenswrapper[4689]: E0307 04:24:38.620072 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:38 crc kubenswrapper[4689]: E0307 04:24:38.620951 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:38 crc kubenswrapper[4689]: E0307 04:24:38.621387 4689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:38 crc kubenswrapper[4689]: E0307 04:24:38.621422 4689 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 
04:24:39 crc kubenswrapper[4689]: I0307 04:24:39.825004 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:39 crc kubenswrapper[4689]: I0307 04:24:39.826305 4689 status_manager.go:851] "Failed to get status for pod" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:39 crc kubenswrapper[4689]: I0307 04:24:39.852470 4689 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c62d0d3-38fb-407a-89b0-9ba3a380c851" Mar 07 04:24:39 crc kubenswrapper[4689]: I0307 04:24:39.852539 4689 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c62d0d3-38fb-407a-89b0-9ba3a380c851" Mar 07 04:24:39 crc kubenswrapper[4689]: E0307 04:24:39.854557 4689 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:39 crc kubenswrapper[4689]: I0307 04:24:39.855437 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:40 crc kubenswrapper[4689]: I0307 04:24:40.438980 4689 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="05f0aa9149adc09775a3b55f13327a9248031c112d15f46f2546c104f35ecb09" exitCode=0 Mar 07 04:24:40 crc kubenswrapper[4689]: I0307 04:24:40.439115 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"05f0aa9149adc09775a3b55f13327a9248031c112d15f46f2546c104f35ecb09"} Mar 07 04:24:40 crc kubenswrapper[4689]: I0307 04:24:40.439406 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6b7598922cd682be0b2900ffc04e65c5eb90e9486e537f07bc2fc4e7740927f0"} Mar 07 04:24:40 crc kubenswrapper[4689]: I0307 04:24:40.439815 4689 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c62d0d3-38fb-407a-89b0-9ba3a380c851" Mar 07 04:24:40 crc kubenswrapper[4689]: I0307 04:24:40.439847 4689 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c62d0d3-38fb-407a-89b0-9ba3a380c851" Mar 07 04:24:40 crc kubenswrapper[4689]: I0307 04:24:40.440552 4689 status_manager.go:851] "Failed to get status for pod" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Mar 07 04:24:40 crc kubenswrapper[4689]: E0307 04:24:40.440575 4689 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial 
tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:40 crc kubenswrapper[4689]: E0307 04:24:40.761329 4689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="7s" Mar 07 04:24:41 crc kubenswrapper[4689]: I0307 04:24:41.455466 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"13a509a2b01853ae6388986a57d18b71df39d66c06fa1bd8fdab720971536556"} Mar 07 04:24:41 crc kubenswrapper[4689]: I0307 04:24:41.455534 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e448ac976b5ac7e8eaa52919d2d4f90e59917f4df19ee533370c8cd5929b79f1"} Mar 07 04:24:41 crc kubenswrapper[4689]: I0307 04:24:41.463379 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 07 04:24:41 crc kubenswrapper[4689]: I0307 04:24:41.464226 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 07 04:24:41 crc kubenswrapper[4689]: I0307 04:24:41.464287 4689 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765" exitCode=1 Mar 07 04:24:41 crc kubenswrapper[4689]: I0307 04:24:41.464330 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765"} Mar 07 04:24:41 crc kubenswrapper[4689]: I0307 04:24:41.464835 4689 scope.go:117] "RemoveContainer" containerID="293ffad9788a2bdc2982b9e7bdeeb0168011eafccf385fcc70db42d84bb51765" Mar 07 04:24:42 crc kubenswrapper[4689]: I0307 04:24:42.165878 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:24:42 crc kubenswrapper[4689]: I0307 04:24:42.474191 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b092916422d022a0e00b49b4acae91b26791b7ea2e2c6ef9b195772646dd4b1d"} Mar 07 04:24:42 crc kubenswrapper[4689]: I0307 04:24:42.474269 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aab5c5c8a1f8fa2817d9e69edcdab0d6982f9d53849307ee90cbd3e3030d06b2"} Mar 07 04:24:42 crc kubenswrapper[4689]: I0307 04:24:42.474283 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f49a18dbb2727fe131032de4dfe80e7ab4ce471f26bdbd9e87dd8e44357aaf9e"} Mar 07 04:24:42 crc kubenswrapper[4689]: I0307 04:24:42.474401 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:42 crc kubenswrapper[4689]: I0307 04:24:42.474524 4689 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c62d0d3-38fb-407a-89b0-9ba3a380c851" Mar 07 04:24:42 crc kubenswrapper[4689]: I0307 04:24:42.474550 4689 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c62d0d3-38fb-407a-89b0-9ba3a380c851" Mar 07 04:24:42 crc kubenswrapper[4689]: I0307 04:24:42.477272 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 07 04:24:42 crc kubenswrapper[4689]: I0307 04:24:42.477770 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 07 04:24:42 crc kubenswrapper[4689]: I0307 04:24:42.477813 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a21a93a3015d1064103873ae0acb9eeab4f174c783e2ed68503acd98e8d0ffd3"} Mar 07 04:24:43 crc kubenswrapper[4689]: I0307 04:24:43.750801 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:24:43 crc kubenswrapper[4689]: I0307 04:24:43.769104 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:24:44 crc kubenswrapper[4689]: I0307 04:24:44.491101 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:24:44 crc kubenswrapper[4689]: I0307 04:24:44.856072 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:44 crc kubenswrapper[4689]: I0307 04:24:44.856141 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:44 crc 
kubenswrapper[4689]: I0307 04:24:44.865478 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:47 crc kubenswrapper[4689]: I0307 04:24:47.486589 4689 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:47 crc kubenswrapper[4689]: I0307 04:24:47.506087 4689 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c62d0d3-38fb-407a-89b0-9ba3a380c851" Mar 07 04:24:47 crc kubenswrapper[4689]: I0307 04:24:47.506116 4689 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c62d0d3-38fb-407a-89b0-9ba3a380c851" Mar 07 04:24:47 crc kubenswrapper[4689]: I0307 04:24:47.522836 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:24:47 crc kubenswrapper[4689]: I0307 04:24:47.544681 4689 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="47e7caa9-619d-4cb2-bafa-0c03b122ff78" Mar 07 04:24:48 crc kubenswrapper[4689]: I0307 04:24:48.512872 4689 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c62d0d3-38fb-407a-89b0-9ba3a380c851" Mar 07 04:24:48 crc kubenswrapper[4689]: I0307 04:24:48.512925 4689 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c62d0d3-38fb-407a-89b0-9ba3a380c851" Mar 07 04:24:48 crc kubenswrapper[4689]: I0307 04:24:48.517069 4689 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="47e7caa9-619d-4cb2-bafa-0c03b122ff78" Mar 07 04:24:52 crc kubenswrapper[4689]: I0307 
04:24:52.172283 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 04:24:56 crc kubenswrapper[4689]: I0307 04:24:56.872199 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 07 04:24:57 crc kubenswrapper[4689]: I0307 04:24:57.650199 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 07 04:24:58 crc kubenswrapper[4689]: I0307 04:24:58.186235 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 07 04:24:59 crc kubenswrapper[4689]: I0307 04:24:59.122544 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 07 04:24:59 crc kubenswrapper[4689]: I0307 04:24:59.304698 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 07 04:24:59 crc kubenswrapper[4689]: I0307 04:24:59.312282 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 07 04:24:59 crc kubenswrapper[4689]: I0307 04:24:59.423132 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 07 04:24:59 crc kubenswrapper[4689]: I0307 04:24:59.648372 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 07 04:25:00 crc kubenswrapper[4689]: I0307 04:25:00.028301 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 07 04:25:00 crc kubenswrapper[4689]: I0307 04:25:00.118023 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 07 04:25:00 crc kubenswrapper[4689]: I0307 04:25:00.147388 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 07 04:25:00 crc kubenswrapper[4689]: I0307 04:25:00.369497 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 07 04:25:00 crc kubenswrapper[4689]: I0307 04:25:00.430804 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 07 04:25:00 crc kubenswrapper[4689]: I0307 04:25:00.568026 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 07 04:25:00 crc kubenswrapper[4689]: I0307 04:25:00.663115 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 07 04:25:00 crc kubenswrapper[4689]: I0307 04:25:00.775930 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 07 04:25:00 crc kubenswrapper[4689]: I0307 04:25:00.808300 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 07 04:25:00 crc kubenswrapper[4689]: I0307 04:25:00.990037 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.113221 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.198586 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.229329 
4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.244923 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.304696 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.305305 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.437015 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.465350 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.470760 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.501564 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.574681 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.711077 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.735765 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"kube-root-ca.crt" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.749311 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.779590 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 07 04:25:01 crc kubenswrapper[4689]: I0307 04:25:01.993635 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.026107 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.190750 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.197959 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.221979 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.231552 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.233728 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.264935 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.400876 4689 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.413616 4689 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.472752 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.586662 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.594991 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.659327 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.689366 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 07 04:25:02 crc kubenswrapper[4689]: I0307 04:25:02.932684 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.093878 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.215341 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.232631 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.278273 4689 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.351544 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.351768 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.370712 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.391913 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.431816 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.477373 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.484840 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.515065 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.567036 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.636314 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.642253 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.671301 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.704997 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.809420 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.852703 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.882568 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.891294 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 07 04:25:03 crc kubenswrapper[4689]: I0307 04:25:03.919046 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.031220 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.216986 4689 reflector.go:368] Caches populated for *v1.Secret from 
object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.246471 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.332541 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.403690 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.588869 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.599384 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.689619 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.695863 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.726670 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.803464 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.807665 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 
04:25:04.810452 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.943266 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.955608 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.985033 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 07 04:25:04 crc kubenswrapper[4689]: I0307 04:25:04.999221 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.128374 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.135430 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.190395 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.193781 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.240082 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.330119 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 07 04:25:05 crc 
kubenswrapper[4689]: I0307 04:25:05.355223 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.373435 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.377967 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.462393 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.570093 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.614643 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.671001 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.754365 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.791760 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.810706 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.878082 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 07 
04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.932679 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.996276 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 07 04:25:05 crc kubenswrapper[4689]: I0307 04:25:05.999224 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.006860 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.101379 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.199482 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.262479 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.306293 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.335977 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.346965 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.358665 4689 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.408528 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.415833 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.436335 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.438758 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.459808 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.569194 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.629069 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.658053 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.735851 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.772365 4689 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.777121 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.778406 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.988058 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 07 04:25:06 crc kubenswrapper[4689]: I0307 04:25:06.988060 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.003952 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.020074 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.108486 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.317431 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.342576 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.355338 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.407621 4689 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.515805 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.535776 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.658873 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.707399 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.722087 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.725955 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.730137 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.771500 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.807151 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.849519 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 07 04:25:07 crc 
kubenswrapper[4689]: I0307 04:25:07.921712 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.925812 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 07 04:25:07 crc kubenswrapper[4689]: I0307 04:25:07.953143 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 07 04:25:08 crc kubenswrapper[4689]: I0307 04:25:08.085685 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 07 04:25:08 crc kubenswrapper[4689]: I0307 04:25:08.120558 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 07 04:25:08 crc kubenswrapper[4689]: I0307 04:25:08.229314 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 07 04:25:08 crc kubenswrapper[4689]: I0307 04:25:08.332783 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 07 04:25:08 crc kubenswrapper[4689]: I0307 04:25:08.474236 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 07 04:25:08 crc kubenswrapper[4689]: I0307 04:25:08.596140 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 07 04:25:08 crc kubenswrapper[4689]: I0307 04:25:08.616833 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 07 04:25:08 crc kubenswrapper[4689]: I0307 04:25:08.701810 4689 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 07 04:25:08 crc kubenswrapper[4689]: I0307 04:25:08.705104 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 07 04:25:08 crc kubenswrapper[4689]: I0307 04:25:08.718237 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 07 04:25:08 crc kubenswrapper[4689]: I0307 04:25:08.959499 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.036245 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.076307 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.202802 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.211336 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.214227 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.214704 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.240855 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 
04:25:09.271456 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.284668 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.313581 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.325276 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.405051 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.588339 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.613450 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.693288 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.717519 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.736354 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.792677 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 07 04:25:09 crc kubenswrapper[4689]: I0307 04:25:09.957638 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.009243 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.061232 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.116922 4689 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.191809 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.237554 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.268601 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.338358 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.369640 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.387950 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.420494 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.446766 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.471343 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.486536 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.500941 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.598631 4689 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.604111 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.604189 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69c77f697f-hvgcv","openshift-kube-apiserver/kube-apiserver-crc","openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl"] Mar 07 04:25:10 crc kubenswrapper[4689]: E0307 04:25:10.604418 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" containerName="installer" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.604448 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" containerName="installer" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.604586 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d78e2e-6dbe-47ff-9db0-79bd0057c7d6" 
containerName="installer" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.605665 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.607743 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.608343 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.608346 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.609012 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.609767 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.610093 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.608605 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.608671 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.610764 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 04:25:10 crc 
kubenswrapper[4689]: I0307 04:25:10.611630 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.612049 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.612372 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.612647 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.613114 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.620506 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.638070 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.658479 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.658461485 podStartE2EDuration="23.658461485s" podCreationTimestamp="2026-03-07 04:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:25:10.648428351 +0000 UTC m=+355.694811840" watchObservedRunningTime="2026-03-07 04:25:10.658461485 +0000 UTC m=+355.704844984" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.720761 4689 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.723884 4689 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.750368 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-client-ca\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.750420 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-serving-cert\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.750448 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45srp\" (UniqueName: \"kubernetes.io/projected/876cf8b2-ab06-4969-b379-d4a409d5856c-kube-api-access-45srp\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.750481 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxr86\" (UniqueName: \"kubernetes.io/projected/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-kube-api-access-qxr86\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " 
pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.750509 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/876cf8b2-ab06-4969-b379-d4a409d5856c-proxy-ca-bundles\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.750533 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/876cf8b2-ab06-4969-b379-d4a409d5856c-client-ca\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.750571 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-config\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.750605 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876cf8b2-ab06-4969-b379-d4a409d5856c-config\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.750638 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/876cf8b2-ab06-4969-b379-d4a409d5856c-serving-cert\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.752728 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.829453 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.850728 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.850970 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-client-ca\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.851028 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-serving-cert\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.851056 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45srp\" (UniqueName: \"kubernetes.io/projected/876cf8b2-ab06-4969-b379-d4a409d5856c-kube-api-access-45srp\") pod 
\"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.851099 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxr86\" (UniqueName: \"kubernetes.io/projected/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-kube-api-access-qxr86\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.851126 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/876cf8b2-ab06-4969-b379-d4a409d5856c-proxy-ca-bundles\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.851144 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/876cf8b2-ab06-4969-b379-d4a409d5856c-client-ca\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.851187 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-config\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.851265 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876cf8b2-ab06-4969-b379-d4a409d5856c-config\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.851290 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/876cf8b2-ab06-4969-b379-d4a409d5856c-serving-cert\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.851901 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-client-ca\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.852586 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/876cf8b2-ab06-4969-b379-d4a409d5856c-proxy-ca-bundles\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.852969 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876cf8b2-ab06-4969-b379-d4a409d5856c-config\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc 
kubenswrapper[4689]: I0307 04:25:10.852985 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/876cf8b2-ab06-4969-b379-d4a409d5856c-client-ca\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.853631 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-config\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.857687 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-serving-cert\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.857689 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/876cf8b2-ab06-4969-b379-d4a409d5856c-serving-cert\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.877844 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45srp\" (UniqueName: \"kubernetes.io/projected/876cf8b2-ab06-4969-b379-d4a409d5856c-kube-api-access-45srp\") pod \"controller-manager-69c77f697f-hvgcv\" (UID: \"876cf8b2-ab06-4969-b379-d4a409d5856c\") " 
pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.877964 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxr86\" (UniqueName: \"kubernetes.io/projected/e0e52b73-2b36-4d58-ab3b-30a236d74f8c-kube-api-access-qxr86\") pod \"route-controller-manager-69bf75486f-m95jl\" (UID: \"e0e52b73-2b36-4d58-ab3b-30a236d74f8c\") " pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.936074 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.943358 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 07 04:25:10 crc kubenswrapper[4689]: I0307 04:25:10.951958 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.075873 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.239101 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.242352 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.343846 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.353679 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.369567 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl"] Mar 07 04:25:11 crc kubenswrapper[4689]: W0307 04:25:11.378014 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e52b73_2b36_4d58_ab3b_30a236d74f8c.slice/crio-e0730db8b26205be52e5551bbb2ab12063677b62153be0e00bfaf48ddd20b478 WatchSource:0}: Error finding container e0730db8b26205be52e5551bbb2ab12063677b62153be0e00bfaf48ddd20b478: Status 404 returned error can't find the container with id e0730db8b26205be52e5551bbb2ab12063677b62153be0e00bfaf48ddd20b478 Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.422491 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-69c77f697f-hvgcv"] Mar 07 04:25:11 crc kubenswrapper[4689]: W0307 04:25:11.423794 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod876cf8b2_ab06_4969_b379_d4a409d5856c.slice/crio-01e90d11bc938de56b6a02a4910065b03de610fdbe772b0392b8913d2d6917c1 WatchSource:0}: Error finding container 01e90d11bc938de56b6a02a4910065b03de610fdbe772b0392b8913d2d6917c1: Status 404 returned error can't find the container with id 01e90d11bc938de56b6a02a4910065b03de610fdbe772b0392b8913d2d6917c1 Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.508001 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.522251 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.601454 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.655993 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" event={"ID":"876cf8b2-ab06-4969-b379-d4a409d5856c","Type":"ContainerStarted","Data":"b716704150a008f7983207e09c955b2b322332186b0c659fada28ce2cd51d382"} Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.656043 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" event={"ID":"876cf8b2-ab06-4969-b379-d4a409d5856c","Type":"ContainerStarted","Data":"01e90d11bc938de56b6a02a4910065b03de610fdbe772b0392b8913d2d6917c1"} Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.656266 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.657789 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" event={"ID":"e0e52b73-2b36-4d58-ab3b-30a236d74f8c","Type":"ContainerStarted","Data":"47bb386830accdd82ce9c79457b65b9df797c781f90224ec9edf9ad443b1420d"} Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.657829 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" event={"ID":"e0e52b73-2b36-4d58-ab3b-30a236d74f8c","Type":"ContainerStarted","Data":"e0730db8b26205be52e5551bbb2ab12063677b62153be0e00bfaf48ddd20b478"} Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.657942 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.661154 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.669936 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.677525 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69c77f697f-hvgcv" podStartSLOduration=46.677505928 podStartE2EDuration="46.677505928s" podCreationTimestamp="2026-03-07 04:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:25:11.677313372 +0000 UTC m=+356.723696861" watchObservedRunningTime="2026-03-07 04:25:11.677505928 +0000 UTC m=+356.723889417" 
Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.708621 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" podStartSLOduration=46.708604228 podStartE2EDuration="46.708604228s" podCreationTimestamp="2026-03-07 04:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:25:11.695038927 +0000 UTC m=+356.741422416" watchObservedRunningTime="2026-03-07 04:25:11.708604228 +0000 UTC m=+356.754987717" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.712220 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.927386 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 07 04:25:11 crc kubenswrapper[4689]: I0307 04:25:11.994140 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.154766 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.155254 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.167917 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.203761 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.216361 4689 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.311862 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.354307 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.361710 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69bf75486f-m95jl" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.369782 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.512211 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.558467 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.646487 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.737310 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.787159 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.840522 4689 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 07 04:25:12 crc kubenswrapper[4689]: I0307 04:25:12.950386 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 07 04:25:13 crc kubenswrapper[4689]: I0307 04:25:13.143550 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 07 04:25:13 crc kubenswrapper[4689]: I0307 04:25:13.160226 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 07 04:25:13 crc kubenswrapper[4689]: I0307 04:25:13.229447 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 07 04:25:13 crc kubenswrapper[4689]: I0307 04:25:13.284114 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 07 04:25:13 crc kubenswrapper[4689]: I0307 04:25:13.434806 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 07 04:25:13 crc kubenswrapper[4689]: I0307 04:25:13.760651 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 07 04:25:14 crc kubenswrapper[4689]: I0307 04:25:14.047839 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 07 04:25:14 crc kubenswrapper[4689]: I0307 04:25:14.508695 4689 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 07 04:25:14 crc kubenswrapper[4689]: I0307 04:25:14.562605 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 07 04:25:14 crc kubenswrapper[4689]: I0307 
04:25:14.622800 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.925580 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fmghp"] Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.926435 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fmghp" podUID="c82c3040-48ed-473b-9386-d58d13364f29" containerName="registry-server" containerID="cri-o://5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d" gracePeriod=30 Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.940206 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvrwc"] Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.940529 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hvrwc" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" containerName="registry-server" containerID="cri-o://e688ca2a50484d2e27ff1b3acc6a67a90b679e4083c06eccffb6c2e49b5a4484" gracePeriod=30 Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.946525 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m4p5r"] Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.946897 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" podUID="76eea8f9-8567-496d-ac53-575a25a140de" containerName="marketplace-operator" containerID="cri-o://d04fe09a98945d979d230cec89cd839e6b709af2391b7e38db6d8a25f8aab189" gracePeriod=30 Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.955798 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgr9z"] Mar 
07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.956081 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tgr9z" podUID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" containerName="registry-server" containerID="cri-o://eea2a417227a70af19d80116d0b67a59e65a9773db20ae2254f9852d6a6682bd" gracePeriod=30 Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.973294 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2wh2s"] Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.973532 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2wh2s" podUID="98a53e64-9323-454c-9de0-a8d348182a64" containerName="registry-server" containerID="cri-o://c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe" gracePeriod=30 Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.979418 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2qc7s"] Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.980227 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:18 crc kubenswrapper[4689]: I0307 04:25:18.985731 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2qc7s"] Mar 07 04:25:19 crc kubenswrapper[4689]: E0307 04:25:19.017464 4689 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc82c3040_48ed_473b_9386_d58d13364f29.slice/crio-conmon-5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d.scope\": RecentStats: unable to find data in memory cache]" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.060156 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/431626bb-08c9-4190-83e1-d4d5fd7cb198-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2qc7s\" (UID: \"431626bb-08c9-4190-83e1-d4d5fd7cb198\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.060260 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/431626bb-08c9-4190-83e1-d4d5fd7cb198-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2qc7s\" (UID: \"431626bb-08c9-4190-83e1-d4d5fd7cb198\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.060313 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nckkq\" (UniqueName: \"kubernetes.io/projected/431626bb-08c9-4190-83e1-d4d5fd7cb198-kube-api-access-nckkq\") pod \"marketplace-operator-79b997595-2qc7s\" (UID: \"431626bb-08c9-4190-83e1-d4d5fd7cb198\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.161317 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/431626bb-08c9-4190-83e1-d4d5fd7cb198-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2qc7s\" (UID: \"431626bb-08c9-4190-83e1-d4d5fd7cb198\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.161701 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/431626bb-08c9-4190-83e1-d4d5fd7cb198-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2qc7s\" (UID: \"431626bb-08c9-4190-83e1-d4d5fd7cb198\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.161744 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nckkq\" (UniqueName: \"kubernetes.io/projected/431626bb-08c9-4190-83e1-d4d5fd7cb198-kube-api-access-nckkq\") pod \"marketplace-operator-79b997595-2qc7s\" (UID: \"431626bb-08c9-4190-83e1-d4d5fd7cb198\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.162944 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/431626bb-08c9-4190-83e1-d4d5fd7cb198-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2qc7s\" (UID: \"431626bb-08c9-4190-83e1-d4d5fd7cb198\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.171468 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/431626bb-08c9-4190-83e1-d4d5fd7cb198-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2qc7s\" (UID: \"431626bb-08c9-4190-83e1-d4d5fd7cb198\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.179968 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nckkq\" (UniqueName: \"kubernetes.io/projected/431626bb-08c9-4190-83e1-d4d5fd7cb198-kube-api-access-nckkq\") pod \"marketplace-operator-79b997595-2qc7s\" (UID: \"431626bb-08c9-4190-83e1-d4d5fd7cb198\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.382410 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.387813 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.464716 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-utilities\") pod \"c82c3040-48ed-473b-9386-d58d13364f29\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.464784 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-catalog-content\") pod \"c82c3040-48ed-473b-9386-d58d13364f29\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.464845 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrct5\" (UniqueName: 
\"kubernetes.io/projected/c82c3040-48ed-473b-9386-d58d13364f29-kube-api-access-wrct5\") pod \"c82c3040-48ed-473b-9386-d58d13364f29\" (UID: \"c82c3040-48ed-473b-9386-d58d13364f29\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.466974 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-utilities" (OuterVolumeSpecName: "utilities") pod "c82c3040-48ed-473b-9386-d58d13364f29" (UID: "c82c3040-48ed-473b-9386-d58d13364f29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.479777 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c82c3040-48ed-473b-9386-d58d13364f29-kube-api-access-wrct5" (OuterVolumeSpecName: "kube-api-access-wrct5") pod "c82c3040-48ed-473b-9386-d58d13364f29" (UID: "c82c3040-48ed-473b-9386-d58d13364f29"). InnerVolumeSpecName "kube-api-access-wrct5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.544684 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c82c3040-48ed-473b-9386-d58d13364f29" (UID: "c82c3040-48ed-473b-9386-d58d13364f29"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.567940 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.568750 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82c3040-48ed-473b-9386-d58d13364f29-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.568799 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrct5\" (UniqueName: \"kubernetes.io/projected/c82c3040-48ed-473b-9386-d58d13364f29-kube-api-access-wrct5\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: E0307 04:25:19.635686 4689 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe is running failed: container process not found" containerID="c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe" cmd=["grpc_health_probe","-addr=:50051"] Mar 07 04:25:19 crc kubenswrapper[4689]: E0307 04:25:19.636215 4689 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe is running failed: container process not found" containerID="c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe" cmd=["grpc_health_probe","-addr=:50051"] Mar 07 04:25:19 crc kubenswrapper[4689]: E0307 04:25:19.639642 4689 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe is running failed: container process not found" containerID="c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe" cmd=["grpc_health_probe","-addr=:50051"] Mar 07 04:25:19 crc kubenswrapper[4689]: E0307 04:25:19.639688 4689 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-2wh2s" podUID="98a53e64-9323-454c-9de0-a8d348182a64" containerName="registry-server" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.702759 4689 generic.go:334] "Generic (PLEG): container finished" podID="c82c3040-48ed-473b-9386-d58d13364f29" containerID="5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d" exitCode=0 Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.702876 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fmghp" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.703298 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmghp" event={"ID":"c82c3040-48ed-473b-9386-d58d13364f29","Type":"ContainerDied","Data":"5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d"} Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.703352 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmghp" event={"ID":"c82c3040-48ed-473b-9386-d58d13364f29","Type":"ContainerDied","Data":"6b09126c0edaa1f93a11d78c4a374f15c6f1c1d644b4cf7d6c40b284ad01e9b0"} Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.703371 4689 scope.go:117] "RemoveContainer" containerID="5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.705258 4689 generic.go:334] "Generic (PLEG): container finished" podID="76eea8f9-8567-496d-ac53-575a25a140de" containerID="d04fe09a98945d979d230cec89cd839e6b709af2391b7e38db6d8a25f8aab189" exitCode=0 Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.705312 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" event={"ID":"76eea8f9-8567-496d-ac53-575a25a140de","Type":"ContainerDied","Data":"d04fe09a98945d979d230cec89cd839e6b709af2391b7e38db6d8a25f8aab189"} Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.712638 4689 generic.go:334] "Generic (PLEG): container finished" podID="98a53e64-9323-454c-9de0-a8d348182a64" containerID="c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe" exitCode=0 Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.712742 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wh2s" 
event={"ID":"98a53e64-9323-454c-9de0-a8d348182a64","Type":"ContainerDied","Data":"c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe"} Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.716402 4689 generic.go:334] "Generic (PLEG): container finished" podID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" containerID="eea2a417227a70af19d80116d0b67a59e65a9773db20ae2254f9852d6a6682bd" exitCode=0 Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.716461 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgr9z" event={"ID":"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5","Type":"ContainerDied","Data":"eea2a417227a70af19d80116d0b67a59e65a9773db20ae2254f9852d6a6682bd"} Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.719094 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.719792 4689 generic.go:334] "Generic (PLEG): container finished" podID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" containerID="e688ca2a50484d2e27ff1b3acc6a67a90b679e4083c06eccffb6c2e49b5a4484" exitCode=0 Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.719820 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvrwc" event={"ID":"fd0c8e82-4247-4dbb-b1a5-4a258259199c","Type":"ContainerDied","Data":"e688ca2a50484d2e27ff1b3acc6a67a90b679e4083c06eccffb6c2e49b5a4484"} Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.727159 4689 scope.go:117] "RemoveContainer" containerID="1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.737932 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.750816 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.757222 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.761260 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fmghp"] Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772383 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-trusted-ca\") pod \"76eea8f9-8567-496d-ac53-575a25a140de\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772446 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgwsh\" (UniqueName: \"kubernetes.io/projected/fd0c8e82-4247-4dbb-b1a5-4a258259199c-kube-api-access-bgwsh\") pod \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772518 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-utilities\") pod \"98a53e64-9323-454c-9de0-a8d348182a64\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772545 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-catalog-content\") pod \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772592 4689 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4cptc\" (UniqueName: \"kubernetes.io/projected/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-kube-api-access-4cptc\") pod \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772612 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npnd9\" (UniqueName: \"kubernetes.io/projected/76eea8f9-8567-496d-ac53-575a25a140de-kube-api-access-npnd9\") pod \"76eea8f9-8567-496d-ac53-575a25a140de\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772639 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-utilities\") pod \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772665 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-catalog-content\") pod \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\" (UID: \"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772686 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-operator-metrics\") pod \"76eea8f9-8567-496d-ac53-575a25a140de\" (UID: \"76eea8f9-8567-496d-ac53-575a25a140de\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772705 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-utilities\") pod 
\"fd0c8e82-4247-4dbb-b1a5-4a258259199c\" (UID: \"fd0c8e82-4247-4dbb-b1a5-4a258259199c\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772739 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-catalog-content\") pod \"98a53e64-9323-454c-9de0-a8d348182a64\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.772790 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwdzg\" (UniqueName: \"kubernetes.io/projected/98a53e64-9323-454c-9de0-a8d348182a64-kube-api-access-kwdzg\") pod \"98a53e64-9323-454c-9de0-a8d348182a64\" (UID: \"98a53e64-9323-454c-9de0-a8d348182a64\") " Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.773249 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "76eea8f9-8567-496d-ac53-575a25a140de" (UID: "76eea8f9-8567-496d-ac53-575a25a140de"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.773776 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-utilities" (OuterVolumeSpecName: "utilities") pod "ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" (UID: "ec8159c9-c2bd-4af5-8b6b-b855bbd968a5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.774235 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-utilities" (OuterVolumeSpecName: "utilities") pod "fd0c8e82-4247-4dbb-b1a5-4a258259199c" (UID: "fd0c8e82-4247-4dbb-b1a5-4a258259199c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.774515 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-utilities" (OuterVolumeSpecName: "utilities") pod "98a53e64-9323-454c-9de0-a8d348182a64" (UID: "98a53e64-9323-454c-9de0-a8d348182a64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.777964 4689 scope.go:117] "RemoveContainer" containerID="f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.779521 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0c8e82-4247-4dbb-b1a5-4a258259199c-kube-api-access-bgwsh" (OuterVolumeSpecName: "kube-api-access-bgwsh") pod "fd0c8e82-4247-4dbb-b1a5-4a258259199c" (UID: "fd0c8e82-4247-4dbb-b1a5-4a258259199c"). InnerVolumeSpecName "kube-api-access-bgwsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.783961 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76eea8f9-8567-496d-ac53-575a25a140de-kube-api-access-npnd9" (OuterVolumeSpecName: "kube-api-access-npnd9") pod "76eea8f9-8567-496d-ac53-575a25a140de" (UID: "76eea8f9-8567-496d-ac53-575a25a140de"). InnerVolumeSpecName "kube-api-access-npnd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.794290 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fmghp"] Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.795101 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98a53e64-9323-454c-9de0-a8d348182a64-kube-api-access-kwdzg" (OuterVolumeSpecName: "kube-api-access-kwdzg") pod "98a53e64-9323-454c-9de0-a8d348182a64" (UID: "98a53e64-9323-454c-9de0-a8d348182a64"). InnerVolumeSpecName "kube-api-access-kwdzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.798927 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "76eea8f9-8567-496d-ac53-575a25a140de" (UID: "76eea8f9-8567-496d-ac53-575a25a140de"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.800222 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-kube-api-access-4cptc" (OuterVolumeSpecName: "kube-api-access-4cptc") pod "ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" (UID: "ec8159c9-c2bd-4af5-8b6b-b855bbd968a5"). InnerVolumeSpecName "kube-api-access-4cptc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.806738 4689 scope.go:117] "RemoveContainer" containerID="5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d" Mar 07 04:25:19 crc kubenswrapper[4689]: E0307 04:25:19.807374 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d\": container with ID starting with 5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d not found: ID does not exist" containerID="5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.807422 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d"} err="failed to get container status \"5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d\": rpc error: code = NotFound desc = could not find container \"5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d\": container with ID starting with 5a7e131aef94a0000ce5d7f7dcff2e18698e894e123a4b4c7b5bb4eab088762d not found: ID does not exist" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.807458 4689 scope.go:117] "RemoveContainer" containerID="1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7" Mar 07 04:25:19 crc kubenswrapper[4689]: E0307 04:25:19.808134 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7\": container with ID starting with 1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7 not found: ID does not exist" containerID="1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.808236 
4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7"} err="failed to get container status \"1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7\": rpc error: code = NotFound desc = could not find container \"1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7\": container with ID starting with 1224fb20657f8194ecfb5b6dfde07a1c63f2de0452232a2c39ea89a80496b2f7 not found: ID does not exist" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.808266 4689 scope.go:117] "RemoveContainer" containerID="f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5" Mar 07 04:25:19 crc kubenswrapper[4689]: E0307 04:25:19.811820 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5\": container with ID starting with f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5 not found: ID does not exist" containerID="f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.811928 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5"} err="failed to get container status \"f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5\": rpc error: code = NotFound desc = could not find container \"f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5\": container with ID starting with f1211fd9ea075098f905d994f01f104a763b223a5e1c3297fc9cd8dacd6275f5 not found: ID does not exist" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.825902 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" (UID: "ec8159c9-c2bd-4af5-8b6b-b855bbd968a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.831662 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c82c3040-48ed-473b-9386-d58d13364f29" path="/var/lib/kubelet/pods/c82c3040-48ed-473b-9386-d58d13364f29/volumes" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.866869 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd0c8e82-4247-4dbb-b1a5-4a258259199c" (UID: "fd0c8e82-4247-4dbb-b1a5-4a258259199c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.874419 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cptc\" (UniqueName: \"kubernetes.io/projected/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-kube-api-access-4cptc\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.874450 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npnd9\" (UniqueName: \"kubernetes.io/projected/76eea8f9-8567-496d-ac53-575a25a140de-kube-api-access-npnd9\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.874463 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.874477 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.874489 4689 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.874500 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.874513 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwdzg\" (UniqueName: \"kubernetes.io/projected/98a53e64-9323-454c-9de0-a8d348182a64-kube-api-access-kwdzg\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.874523 4689 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76eea8f9-8567-496d-ac53-575a25a140de-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.874534 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgwsh\" (UniqueName: \"kubernetes.io/projected/fd0c8e82-4247-4dbb-b1a5-4a258259199c-kube-api-access-bgwsh\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.874544 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.874554 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd0c8e82-4247-4dbb-b1a5-4a258259199c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 
04:25:19.918052 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2qc7s"] Mar 07 04:25:19 crc kubenswrapper[4689]: I0307 04:25:19.984124 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98a53e64-9323-454c-9de0-a8d348182a64" (UID: "98a53e64-9323-454c-9de0-a8d348182a64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.075876 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98a53e64-9323-454c-9de0-a8d348182a64-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.232424 4689 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.232690 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://b209f86d0eb63aa69ca20da8da46cac738229b9dbb0c6e6e9671607dab93d1f7" gracePeriod=5 Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.725298 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" event={"ID":"431626bb-08c9-4190-83e1-d4d5fd7cb198","Type":"ContainerStarted","Data":"896a491962b320ac32f83a4d4d7eebf55f5b283444dde6371ee149237c5cffa0"} Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.725341 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" 
event={"ID":"431626bb-08c9-4190-83e1-d4d5fd7cb198","Type":"ContainerStarted","Data":"606b5bf77cc2151f3a56bc5a7ec7a2cb9067c63357b328580d0adfca57b0bda6"} Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.725507 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.728886 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.728906 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvrwc" event={"ID":"fd0c8e82-4247-4dbb-b1a5-4a258259199c","Type":"ContainerDied","Data":"cac016d6a5f03f781e64c5f0cdc3e8efc9459ca6c80cb65c99555e2aaafd121e"} Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.728927 4689 scope.go:117] "RemoveContainer" containerID="e688ca2a50484d2e27ff1b3acc6a67a90b679e4083c06eccffb6c2e49b5a4484" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.728996 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvrwc" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.732068 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" event={"ID":"76eea8f9-8567-496d-ac53-575a25a140de","Type":"ContainerDied","Data":"30241699ad5af1473e6b470ac38732b5128c4f0f066391f74df95aba3eab4101"} Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.732128 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m4p5r" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.734221 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wh2s" event={"ID":"98a53e64-9323-454c-9de0-a8d348182a64","Type":"ContainerDied","Data":"70200621582e8c6967da7ae5719b2d3dd116dc355d3f0d6614fc2d0bebc9fe7f"} Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.734336 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2wh2s" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.740513 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgr9z" event={"ID":"ec8159c9-c2bd-4af5-8b6b-b855bbd968a5","Type":"ContainerDied","Data":"df30dd16acf27cd274071fd763c43d107b958ec2a40e5b89d7efd4ed889dca7c"} Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.740630 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgr9z" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.748427 4689 scope.go:117] "RemoveContainer" containerID="7282802cb62b4284bd840ee5b1d33d0e80c19fe39a202ae86ce039eea1475e73" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.761335 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2qc7s" podStartSLOduration=2.7613157 podStartE2EDuration="2.7613157s" podCreationTimestamp="2026-03-07 04:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:25:20.758324029 +0000 UTC m=+365.804707558" watchObservedRunningTime="2026-03-07 04:25:20.7613157 +0000 UTC m=+365.807699199" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.789059 4689 scope.go:117] "RemoveContainer" containerID="74a5bd36ad9cbee2b5db5dc80ee5f3d0aa4c1e2609587afa354012efcdd3e6f4" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.808237 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgr9z"] Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.818976 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgr9z"] Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.823183 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m4p5r"] Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.839037 4689 scope.go:117] "RemoveContainer" containerID="d04fe09a98945d979d230cec89cd839e6b709af2391b7e38db6d8a25f8aab189" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.843211 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m4p5r"] Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.848270 4689 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvrwc"] Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.848983 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hvrwc"] Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.851653 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2wh2s"] Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.854130 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2wh2s"] Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.866953 4689 scope.go:117] "RemoveContainer" containerID="c5e1d6e56bca9d9d91c9496d08294c8759b8b4423f9fd884ba459cdde6a3fdbe" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.881143 4689 scope.go:117] "RemoveContainer" containerID="f5182ccad7ad084666ecb485eae286df8d43f273e433ae0b099873949b374b47" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.897115 4689 scope.go:117] "RemoveContainer" containerID="82c1bbc770916edd38d9cea02243cda9697bd464004b82f2544abae9b19d0f92" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.909005 4689 scope.go:117] "RemoveContainer" containerID="eea2a417227a70af19d80116d0b67a59e65a9773db20ae2254f9852d6a6682bd" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.923685 4689 scope.go:117] "RemoveContainer" containerID="d6f4f56df28874c6cec59e556d26e3408f2303402255b872cd72b0e4ccfcc540" Mar 07 04:25:20 crc kubenswrapper[4689]: I0307 04:25:20.934324 4689 scope.go:117] "RemoveContainer" containerID="91b958974b11a0c2309684b1fcb2ec6cca548d20a31245933502d28061c0d57f" Mar 07 04:25:21 crc kubenswrapper[4689]: I0307 04:25:21.838123 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76eea8f9-8567-496d-ac53-575a25a140de" path="/var/lib/kubelet/pods/76eea8f9-8567-496d-ac53-575a25a140de/volumes" Mar 07 04:25:21 crc kubenswrapper[4689]: I0307 04:25:21.838890 4689 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98a53e64-9323-454c-9de0-a8d348182a64" path="/var/lib/kubelet/pods/98a53e64-9323-454c-9de0-a8d348182a64/volumes" Mar 07 04:25:21 crc kubenswrapper[4689]: I0307 04:25:21.839450 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" path="/var/lib/kubelet/pods/ec8159c9-c2bd-4af5-8b6b-b855bbd968a5/volumes" Mar 07 04:25:21 crc kubenswrapper[4689]: I0307 04:25:21.840516 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" path="/var/lib/kubelet/pods/fd0c8e82-4247-4dbb-b1a5-4a258259199c/volumes" Mar 07 04:25:25 crc kubenswrapper[4689]: I0307 04:25:25.781072 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 07 04:25:25 crc kubenswrapper[4689]: I0307 04:25:25.781194 4689 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="b209f86d0eb63aa69ca20da8da46cac738229b9dbb0c6e6e9671607dab93d1f7" exitCode=137 Mar 07 04:25:25 crc kubenswrapper[4689]: I0307 04:25:25.958306 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 07 04:25:25 crc kubenswrapper[4689]: I0307 04:25:25.958451 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.051027 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.051141 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.051302 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.051380 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.051458 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.051895 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: 
"var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.051930 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.051980 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.052004 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.066695 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.153008 4689 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.153404 4689 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.153424 4689 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.153443 4689 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.153459 4689 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.791196 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.791285 4689 scope.go:117] "RemoveContainer" containerID="b209f86d0eb63aa69ca20da8da46cac738229b9dbb0c6e6e9671607dab93d1f7" Mar 07 04:25:26 crc kubenswrapper[4689]: I0307 04:25:26.791430 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 04:25:27 crc kubenswrapper[4689]: I0307 04:25:27.837119 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 07 04:25:50 crc kubenswrapper[4689]: I0307 04:25:50.229687 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fcn6x"] Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.198313 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547626-qtklx"] Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199157 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" containerName="extract-utilities" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199190 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" containerName="extract-utilities" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199201 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" containerName="extract-content" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199208 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" containerName="extract-content" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199221 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a53e64-9323-454c-9de0-a8d348182a64" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199228 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a53e64-9323-454c-9de0-a8d348182a64" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199239 4689 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="76eea8f9-8567-496d-ac53-575a25a140de" containerName="marketplace-operator" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199247 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="76eea8f9-8567-496d-ac53-575a25a140de" containerName="marketplace-operator" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199258 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82c3040-48ed-473b-9386-d58d13364f29" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199265 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82c3040-48ed-473b-9386-d58d13364f29" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199277 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82c3040-48ed-473b-9386-d58d13364f29" containerName="extract-content" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199284 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82c3040-48ed-473b-9386-d58d13364f29" containerName="extract-content" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199293 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a53e64-9323-454c-9de0-a8d348182a64" containerName="extract-utilities" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199299 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a53e64-9323-454c-9de0-a8d348182a64" containerName="extract-utilities" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199305 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" containerName="extract-content" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199313 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" containerName="extract-content" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199322 4689 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c82c3040-48ed-473b-9386-d58d13364f29" containerName="extract-utilities" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199330 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82c3040-48ed-473b-9386-d58d13364f29" containerName="extract-utilities" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199341 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199348 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199360 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199368 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199380 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a53e64-9323-454c-9de0-a8d348182a64" containerName="extract-content" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199387 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a53e64-9323-454c-9de0-a8d348182a64" containerName="extract-content" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199397 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" containerName="extract-utilities" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199405 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" containerName="extract-utilities" Mar 07 04:26:00 crc kubenswrapper[4689]: E0307 04:26:00.199413 4689 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199419 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199521 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="c82c3040-48ed-473b-9386-d58d13364f29" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199533 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="76eea8f9-8567-496d-ac53-575a25a140de" containerName="marketplace-operator" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199544 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8159c9-c2bd-4af5-8b6b-b855bbd968a5" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199556 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="98a53e64-9323-454c-9de0-a8d348182a64" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199567 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0c8e82-4247-4dbb-b1a5-4a258259199c" containerName="registry-server" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.199577 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.200047 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547626-qtklx" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.203024 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547626-qtklx"] Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.203192 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.203658 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.205909 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.299716 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4swg9\" (UniqueName: \"kubernetes.io/projected/c64cdb5e-38bb-478e-8fcc-54fb0a234918-kube-api-access-4swg9\") pod \"auto-csr-approver-29547626-qtklx\" (UID: \"c64cdb5e-38bb-478e-8fcc-54fb0a234918\") " pod="openshift-infra/auto-csr-approver-29547626-qtklx" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.401354 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4swg9\" (UniqueName: \"kubernetes.io/projected/c64cdb5e-38bb-478e-8fcc-54fb0a234918-kube-api-access-4swg9\") pod \"auto-csr-approver-29547626-qtklx\" (UID: \"c64cdb5e-38bb-478e-8fcc-54fb0a234918\") " pod="openshift-infra/auto-csr-approver-29547626-qtklx" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.421231 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4swg9\" (UniqueName: \"kubernetes.io/projected/c64cdb5e-38bb-478e-8fcc-54fb0a234918-kube-api-access-4swg9\") pod \"auto-csr-approver-29547626-qtklx\" (UID: \"c64cdb5e-38bb-478e-8fcc-54fb0a234918\") " 
pod="openshift-infra/auto-csr-approver-29547626-qtklx" Mar 07 04:26:00 crc kubenswrapper[4689]: I0307 04:26:00.524717 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547626-qtklx" Mar 07 04:26:01 crc kubenswrapper[4689]: I0307 04:26:01.037058 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547626-qtklx"] Mar 07 04:26:02 crc kubenswrapper[4689]: I0307 04:26:02.025713 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547626-qtklx" event={"ID":"c64cdb5e-38bb-478e-8fcc-54fb0a234918","Type":"ContainerStarted","Data":"3f2f662e7da9df21723aa8843d7122f1d2381dcf01f0a56bfae4bbb1e0988485"} Mar 07 04:26:03 crc kubenswrapper[4689]: I0307 04:26:03.033350 4689 generic.go:334] "Generic (PLEG): container finished" podID="c64cdb5e-38bb-478e-8fcc-54fb0a234918" containerID="044b0606461e6ef3ea35a49511ea31bbbc16ced31b504b18a99d6cea618859e7" exitCode=0 Mar 07 04:26:03 crc kubenswrapper[4689]: I0307 04:26:03.033410 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547626-qtklx" event={"ID":"c64cdb5e-38bb-478e-8fcc-54fb0a234918","Type":"ContainerDied","Data":"044b0606461e6ef3ea35a49511ea31bbbc16ced31b504b18a99d6cea618859e7"} Mar 07 04:26:04 crc kubenswrapper[4689]: I0307 04:26:04.330264 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547626-qtklx" Mar 07 04:26:04 crc kubenswrapper[4689]: I0307 04:26:04.356793 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4swg9\" (UniqueName: \"kubernetes.io/projected/c64cdb5e-38bb-478e-8fcc-54fb0a234918-kube-api-access-4swg9\") pod \"c64cdb5e-38bb-478e-8fcc-54fb0a234918\" (UID: \"c64cdb5e-38bb-478e-8fcc-54fb0a234918\") " Mar 07 04:26:04 crc kubenswrapper[4689]: I0307 04:26:04.369746 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c64cdb5e-38bb-478e-8fcc-54fb0a234918-kube-api-access-4swg9" (OuterVolumeSpecName: "kube-api-access-4swg9") pod "c64cdb5e-38bb-478e-8fcc-54fb0a234918" (UID: "c64cdb5e-38bb-478e-8fcc-54fb0a234918"). InnerVolumeSpecName "kube-api-access-4swg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:26:04 crc kubenswrapper[4689]: I0307 04:26:04.465995 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4swg9\" (UniqueName: \"kubernetes.io/projected/c64cdb5e-38bb-478e-8fcc-54fb0a234918-kube-api-access-4swg9\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:05 crc kubenswrapper[4689]: I0307 04:26:05.046777 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547626-qtklx" event={"ID":"c64cdb5e-38bb-478e-8fcc-54fb0a234918","Type":"ContainerDied","Data":"3f2f662e7da9df21723aa8843d7122f1d2381dcf01f0a56bfae4bbb1e0988485"} Mar 07 04:26:05 crc kubenswrapper[4689]: I0307 04:26:05.047098 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f2f662e7da9df21723aa8843d7122f1d2381dcf01f0a56bfae4bbb1e0988485" Mar 07 04:26:05 crc kubenswrapper[4689]: I0307 04:26:05.046884 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547626-qtklx" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.278575 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" podUID="e4c3b676-f7ae-4659-a3f6-73dcc319bed8" containerName="oauth-openshift" containerID="cri-o://e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3" gracePeriod=15 Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.729038 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.764161 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6"] Mar 07 04:26:15 crc kubenswrapper[4689]: E0307 04:26:15.764398 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64cdb5e-38bb-478e-8fcc-54fb0a234918" containerName="oc" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.764412 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64cdb5e-38bb-478e-8fcc-54fb0a234918" containerName="oc" Mar 07 04:26:15 crc kubenswrapper[4689]: E0307 04:26:15.764427 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c3b676-f7ae-4659-a3f6-73dcc319bed8" containerName="oauth-openshift" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.764436 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c3b676-f7ae-4659-a3f6-73dcc319bed8" containerName="oauth-openshift" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.764545 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="c64cdb5e-38bb-478e-8fcc-54fb0a234918" containerName="oc" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.764569 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c3b676-f7ae-4659-a3f6-73dcc319bed8" 
containerName="oauth-openshift" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.765000 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827057 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-serving-cert\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827109 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-login\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827134 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5t86\" (UniqueName: \"kubernetes.io/projected/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-kube-api-access-w5t86\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827164 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-cliconfig\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827205 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-service-ca\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827236 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-provider-selection\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827267 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-error\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827298 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-router-certs\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827333 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-ocp-branding-template\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827357 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-idp-0-file-data\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827380 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-policies\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827403 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-session\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827425 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-dir\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827455 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-trusted-ca-bundle\") pod \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\" (UID: \"e4c3b676-f7ae-4659-a3f6-73dcc319bed8\") " Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827561 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827591 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-template-login\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827622 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827649 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-service-ca\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827674 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " 
pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827697 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-router-certs\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827720 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827752 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-session\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827778 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-template-error\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827800 4689 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-audit-dir\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827824 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzqwg\" (UniqueName: \"kubernetes.io/projected/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-kube-api-access-bzqwg\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827852 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827885 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-audit-policies\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.827906 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-serving-cert\") 
pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.828130 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.828721 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.829069 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.829434 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.829888 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.834859 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-kube-api-access-w5t86" (OuterVolumeSpecName: "kube-api-access-w5t86") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "kube-api-access-w5t86". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.835033 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.837286 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.837587 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.837851 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.838265 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.838512 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.838662 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.846579 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e4c3b676-f7ae-4659-a3f6-73dcc319bed8" (UID: "e4c3b676-f7ae-4659-a3f6-73dcc319bed8"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.868152 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6"] Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928404 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-audit-dir\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928466 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-template-error\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " 
pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928489 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzqwg\" (UniqueName: \"kubernetes.io/projected/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-kube-api-access-bzqwg\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928514 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928541 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-audit-policies\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928560 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928585 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928603 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-template-login\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928593 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-audit-dir\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928625 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928712 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-service-ca\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " 
pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928744 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928765 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-router-certs\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928797 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928853 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-session\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928923 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928937 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928950 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5t86\" (UniqueName: \"kubernetes.io/projected/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-kube-api-access-w5t86\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928963 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928975 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.928990 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.929001 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 
04:26:15.929014 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.929026 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.929037 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.929050 4689 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.929062 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.929075 4689 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.929086 4689 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4c3b676-f7ae-4659-a3f6-73dcc319bed8-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.929444 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-service-ca\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.930294 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-audit-policies\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.932801 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.933295 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.933904 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.934154 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.934246 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-template-error\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.934777 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.934828 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-user-template-login\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " 
pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.935501 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.935669 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-session\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.940483 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-v4-0-config-system-router-certs\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:15 crc kubenswrapper[4689]: I0307 04:26:15.956336 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzqwg\" (UniqueName: \"kubernetes.io/projected/312ffdef-12c2-4c4a-ad1e-6b508fc86c49-kube-api-access-bzqwg\") pod \"oauth-openshift-5584c6b7fb-fwkv6\" (UID: \"312ffdef-12c2-4c4a-ad1e-6b508fc86c49\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:16 crc kubenswrapper[4689]: I0307 04:26:16.109796 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:16 crc kubenswrapper[4689]: I0307 04:26:16.117022 4689 generic.go:334] "Generic (PLEG): container finished" podID="e4c3b676-f7ae-4659-a3f6-73dcc319bed8" containerID="e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3" exitCode=0 Mar 07 04:26:16 crc kubenswrapper[4689]: I0307 04:26:16.117096 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" Mar 07 04:26:16 crc kubenswrapper[4689]: I0307 04:26:16.117190 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" event={"ID":"e4c3b676-f7ae-4659-a3f6-73dcc319bed8","Type":"ContainerDied","Data":"e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3"} Mar 07 04:26:16 crc kubenswrapper[4689]: I0307 04:26:16.117276 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fcn6x" event={"ID":"e4c3b676-f7ae-4659-a3f6-73dcc319bed8","Type":"ContainerDied","Data":"3899f63dc0cadd52c511a0d4db8be92b4031bd10eb64625ad6cd57c7721a2027"} Mar 07 04:26:16 crc kubenswrapper[4689]: I0307 04:26:16.117316 4689 scope.go:117] "RemoveContainer" containerID="e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3" Mar 07 04:26:16 crc kubenswrapper[4689]: I0307 04:26:16.157579 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fcn6x"] Mar 07 04:26:16 crc kubenswrapper[4689]: I0307 04:26:16.159228 4689 scope.go:117] "RemoveContainer" containerID="e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3" Mar 07 04:26:16 crc kubenswrapper[4689]: E0307 04:26:16.159730 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3\": container with ID starting with e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3 not found: ID does not exist" containerID="e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3" Mar 07 04:26:16 crc kubenswrapper[4689]: I0307 04:26:16.159788 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3"} err="failed to get container status \"e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3\": rpc error: code = NotFound desc = could not find container \"e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3\": container with ID starting with e11173323ddc837f377452960fe745a44347734c3bc1900c0e8318b4d98735d3 not found: ID does not exist" Mar 07 04:26:16 crc kubenswrapper[4689]: I0307 04:26:16.162536 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fcn6x"] Mar 07 04:26:16 crc kubenswrapper[4689]: I0307 04:26:16.393635 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6"] Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.126537 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" event={"ID":"312ffdef-12c2-4c4a-ad1e-6b508fc86c49","Type":"ContainerStarted","Data":"8941f7509fdf13aeb054eda76d29d2226e619a7be144a5de9b8be947a0ab485f"} Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.126606 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" event={"ID":"312ffdef-12c2-4c4a-ad1e-6b508fc86c49","Type":"ContainerStarted","Data":"0c4bbe46fe36382089cdd196bfa62defe23daaced27dd389b91eca7fffcb5b0f"} Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.127025 4689 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.143880 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.163422 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5584c6b7fb-fwkv6" podStartSLOduration=27.163389918 podStartE2EDuration="27.163389918s" podCreationTimestamp="2026-03-07 04:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:26:17.153989343 +0000 UTC m=+422.200372872" watchObservedRunningTime="2026-03-07 04:26:17.163389918 +0000 UTC m=+422.209773447" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.704239 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8cclj"] Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.706084 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.713648 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.723273 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8cclj"] Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.837580 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4c3b676-f7ae-4659-a3f6-73dcc319bed8" path="/var/lib/kubelet/pods/e4c3b676-f7ae-4659-a3f6-73dcc319bed8/volumes" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.858107 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769da50c-a6db-491d-90d7-146ac186dad8-utilities\") pod \"redhat-operators-8cclj\" (UID: \"769da50c-a6db-491d-90d7-146ac186dad8\") " pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.858535 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769da50c-a6db-491d-90d7-146ac186dad8-catalog-content\") pod \"redhat-operators-8cclj\" (UID: \"769da50c-a6db-491d-90d7-146ac186dad8\") " pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.858703 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwm6z\" (UniqueName: \"kubernetes.io/projected/769da50c-a6db-491d-90d7-146ac186dad8-kube-api-access-hwm6z\") pod \"redhat-operators-8cclj\" (UID: \"769da50c-a6db-491d-90d7-146ac186dad8\") " pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.903797 4689 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4qkv8"] Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.904971 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.907966 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.920569 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4qkv8"] Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.959683 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769da50c-a6db-491d-90d7-146ac186dad8-utilities\") pod \"redhat-operators-8cclj\" (UID: \"769da50c-a6db-491d-90d7-146ac186dad8\") " pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.959790 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769da50c-a6db-491d-90d7-146ac186dad8-catalog-content\") pod \"redhat-operators-8cclj\" (UID: \"769da50c-a6db-491d-90d7-146ac186dad8\") " pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.959854 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwm6z\" (UniqueName: \"kubernetes.io/projected/769da50c-a6db-491d-90d7-146ac186dad8-kube-api-access-hwm6z\") pod \"redhat-operators-8cclj\" (UID: \"769da50c-a6db-491d-90d7-146ac186dad8\") " pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.961832 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/769da50c-a6db-491d-90d7-146ac186dad8-utilities\") pod \"redhat-operators-8cclj\" (UID: \"769da50c-a6db-491d-90d7-146ac186dad8\") " pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.962325 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769da50c-a6db-491d-90d7-146ac186dad8-catalog-content\") pod \"redhat-operators-8cclj\" (UID: \"769da50c-a6db-491d-90d7-146ac186dad8\") " pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:17 crc kubenswrapper[4689]: I0307 04:26:17.988094 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwm6z\" (UniqueName: \"kubernetes.io/projected/769da50c-a6db-491d-90d7-146ac186dad8-kube-api-access-hwm6z\") pod \"redhat-operators-8cclj\" (UID: \"769da50c-a6db-491d-90d7-146ac186dad8\") " pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.032163 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.060759 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5bmj\" (UniqueName: \"kubernetes.io/projected/4a76727a-27ba-4d05-92cf-01ec595c6989-kube-api-access-f5bmj\") pod \"community-operators-4qkv8\" (UID: \"4a76727a-27ba-4d05-92cf-01ec595c6989\") " pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.060845 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a76727a-27ba-4d05-92cf-01ec595c6989-utilities\") pod \"community-operators-4qkv8\" (UID: \"4a76727a-27ba-4d05-92cf-01ec595c6989\") " pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.060880 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a76727a-27ba-4d05-92cf-01ec595c6989-catalog-content\") pod \"community-operators-4qkv8\" (UID: \"4a76727a-27ba-4d05-92cf-01ec595c6989\") " pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.161982 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5bmj\" (UniqueName: \"kubernetes.io/projected/4a76727a-27ba-4d05-92cf-01ec595c6989-kube-api-access-f5bmj\") pod \"community-operators-4qkv8\" (UID: \"4a76727a-27ba-4d05-92cf-01ec595c6989\") " pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.162263 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a76727a-27ba-4d05-92cf-01ec595c6989-utilities\") pod 
\"community-operators-4qkv8\" (UID: \"4a76727a-27ba-4d05-92cf-01ec595c6989\") " pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.162282 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a76727a-27ba-4d05-92cf-01ec595c6989-catalog-content\") pod \"community-operators-4qkv8\" (UID: \"4a76727a-27ba-4d05-92cf-01ec595c6989\") " pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.162940 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a76727a-27ba-4d05-92cf-01ec595c6989-catalog-content\") pod \"community-operators-4qkv8\" (UID: \"4a76727a-27ba-4d05-92cf-01ec595c6989\") " pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.163004 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a76727a-27ba-4d05-92cf-01ec595c6989-utilities\") pod \"community-operators-4qkv8\" (UID: \"4a76727a-27ba-4d05-92cf-01ec595c6989\") " pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.188205 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5bmj\" (UniqueName: \"kubernetes.io/projected/4a76727a-27ba-4d05-92cf-01ec595c6989-kube-api-access-f5bmj\") pod \"community-operators-4qkv8\" (UID: \"4a76727a-27ba-4d05-92cf-01ec595c6989\") " pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.237739 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.448426 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8cclj"] Mar 07 04:26:18 crc kubenswrapper[4689]: W0307 04:26:18.457731 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod769da50c_a6db_491d_90d7_146ac186dad8.slice/crio-1df89e9f6e7c1f29e6f2dd52d102e19ebc629f9169abefa2b52a74f2d2c46456 WatchSource:0}: Error finding container 1df89e9f6e7c1f29e6f2dd52d102e19ebc629f9169abefa2b52a74f2d2c46456: Status 404 returned error can't find the container with id 1df89e9f6e7c1f29e6f2dd52d102e19ebc629f9169abefa2b52a74f2d2c46456 Mar 07 04:26:18 crc kubenswrapper[4689]: I0307 04:26:18.695271 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4qkv8"] Mar 07 04:26:18 crc kubenswrapper[4689]: W0307 04:26:18.734276 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a76727a_27ba_4d05_92cf_01ec595c6989.slice/crio-693914d510fda0dfd5e3b655453c1c1e6168ab8f2a0d0656a32959910dff31bf WatchSource:0}: Error finding container 693914d510fda0dfd5e3b655453c1c1e6168ab8f2a0d0656a32959910dff31bf: Status 404 returned error can't find the container with id 693914d510fda0dfd5e3b655453c1c1e6168ab8f2a0d0656a32959910dff31bf Mar 07 04:26:19 crc kubenswrapper[4689]: I0307 04:26:19.156752 4689 generic.go:334] "Generic (PLEG): container finished" podID="4a76727a-27ba-4d05-92cf-01ec595c6989" containerID="6e716fec92d6ce9e49480ac9cc3f87973f8ce9544382500837373917867802de" exitCode=0 Mar 07 04:26:19 crc kubenswrapper[4689]: I0307 04:26:19.156946 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qkv8" 
event={"ID":"4a76727a-27ba-4d05-92cf-01ec595c6989","Type":"ContainerDied","Data":"6e716fec92d6ce9e49480ac9cc3f87973f8ce9544382500837373917867802de"} Mar 07 04:26:19 crc kubenswrapper[4689]: I0307 04:26:19.156994 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qkv8" event={"ID":"4a76727a-27ba-4d05-92cf-01ec595c6989","Type":"ContainerStarted","Data":"693914d510fda0dfd5e3b655453c1c1e6168ab8f2a0d0656a32959910dff31bf"} Mar 07 04:26:19 crc kubenswrapper[4689]: I0307 04:26:19.161000 4689 generic.go:334] "Generic (PLEG): container finished" podID="769da50c-a6db-491d-90d7-146ac186dad8" containerID="fa467ca71995f51a2b091b8e214b110305cde707b45445ef084aadd928491064" exitCode=0 Mar 07 04:26:19 crc kubenswrapper[4689]: I0307 04:26:19.161236 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cclj" event={"ID":"769da50c-a6db-491d-90d7-146ac186dad8","Type":"ContainerDied","Data":"fa467ca71995f51a2b091b8e214b110305cde707b45445ef084aadd928491064"} Mar 07 04:26:19 crc kubenswrapper[4689]: I0307 04:26:19.161287 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cclj" event={"ID":"769da50c-a6db-491d-90d7-146ac186dad8","Type":"ContainerStarted","Data":"1df89e9f6e7c1f29e6f2dd52d102e19ebc629f9169abefa2b52a74f2d2c46456"} Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.098374 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4xc7w"] Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.100517 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.103271 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.113889 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xc7w"] Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.168816 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qkv8" event={"ID":"4a76727a-27ba-4d05-92cf-01ec595c6989","Type":"ContainerStarted","Data":"743cc435ca800dfaa9fdb919a39df9fc8a7a50e1c94ec94a2d6b5e55df77d45d"} Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.202986 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba6dceb-a52c-4108-af6e-ca861cdff2d9-utilities\") pod \"certified-operators-4xc7w\" (UID: \"8ba6dceb-a52c-4108-af6e-ca861cdff2d9\") " pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.203054 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba6dceb-a52c-4108-af6e-ca861cdff2d9-catalog-content\") pod \"certified-operators-4xc7w\" (UID: \"8ba6dceb-a52c-4108-af6e-ca861cdff2d9\") " pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.203104 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfl28\" (UniqueName: \"kubernetes.io/projected/8ba6dceb-a52c-4108-af6e-ca861cdff2d9-kube-api-access-zfl28\") pod \"certified-operators-4xc7w\" (UID: \"8ba6dceb-a52c-4108-af6e-ca861cdff2d9\") " 
pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.298610 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g6vbv"] Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.299788 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.302307 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.304037 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba6dceb-a52c-4108-af6e-ca861cdff2d9-utilities\") pod \"certified-operators-4xc7w\" (UID: \"8ba6dceb-a52c-4108-af6e-ca861cdff2d9\") " pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.304148 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba6dceb-a52c-4108-af6e-ca861cdff2d9-catalog-content\") pod \"certified-operators-4xc7w\" (UID: \"8ba6dceb-a52c-4108-af6e-ca861cdff2d9\") " pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.304250 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfl28\" (UniqueName: \"kubernetes.io/projected/8ba6dceb-a52c-4108-af6e-ca861cdff2d9-kube-api-access-zfl28\") pod \"certified-operators-4xc7w\" (UID: \"8ba6dceb-a52c-4108-af6e-ca861cdff2d9\") " pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.304544 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8ba6dceb-a52c-4108-af6e-ca861cdff2d9-utilities\") pod \"certified-operators-4xc7w\" (UID: \"8ba6dceb-a52c-4108-af6e-ca861cdff2d9\") " pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.304805 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba6dceb-a52c-4108-af6e-ca861cdff2d9-catalog-content\") pod \"certified-operators-4xc7w\" (UID: \"8ba6dceb-a52c-4108-af6e-ca861cdff2d9\") " pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.318687 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6vbv"] Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.340556 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfl28\" (UniqueName: \"kubernetes.io/projected/8ba6dceb-a52c-4108-af6e-ca861cdff2d9-kube-api-access-zfl28\") pod \"certified-operators-4xc7w\" (UID: \"8ba6dceb-a52c-4108-af6e-ca861cdff2d9\") " pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.405830 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f58e77c1-4fe5-4b43-bd3c-babc094119f0-utilities\") pod \"redhat-marketplace-g6vbv\" (UID: \"f58e77c1-4fe5-4b43-bd3c-babc094119f0\") " pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.405968 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmllw\" (UniqueName: \"kubernetes.io/projected/f58e77c1-4fe5-4b43-bd3c-babc094119f0-kube-api-access-dmllw\") pod \"redhat-marketplace-g6vbv\" (UID: \"f58e77c1-4fe5-4b43-bd3c-babc094119f0\") " 
pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.406132 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f58e77c1-4fe5-4b43-bd3c-babc094119f0-catalog-content\") pod \"redhat-marketplace-g6vbv\" (UID: \"f58e77c1-4fe5-4b43-bd3c-babc094119f0\") " pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.462221 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.507290 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmllw\" (UniqueName: \"kubernetes.io/projected/f58e77c1-4fe5-4b43-bd3c-babc094119f0-kube-api-access-dmllw\") pod \"redhat-marketplace-g6vbv\" (UID: \"f58e77c1-4fe5-4b43-bd3c-babc094119f0\") " pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.507415 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f58e77c1-4fe5-4b43-bd3c-babc094119f0-catalog-content\") pod \"redhat-marketplace-g6vbv\" (UID: \"f58e77c1-4fe5-4b43-bd3c-babc094119f0\") " pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.507448 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f58e77c1-4fe5-4b43-bd3c-babc094119f0-utilities\") pod \"redhat-marketplace-g6vbv\" (UID: \"f58e77c1-4fe5-4b43-bd3c-babc094119f0\") " pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.507866 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f58e77c1-4fe5-4b43-bd3c-babc094119f0-catalog-content\") pod \"redhat-marketplace-g6vbv\" (UID: \"f58e77c1-4fe5-4b43-bd3c-babc094119f0\") " pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.507977 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f58e77c1-4fe5-4b43-bd3c-babc094119f0-utilities\") pod \"redhat-marketplace-g6vbv\" (UID: \"f58e77c1-4fe5-4b43-bd3c-babc094119f0\") " pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.526492 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmllw\" (UniqueName: \"kubernetes.io/projected/f58e77c1-4fe5-4b43-bd3c-babc094119f0-kube-api-access-dmllw\") pod \"redhat-marketplace-g6vbv\" (UID: \"f58e77c1-4fe5-4b43-bd3c-babc094119f0\") " pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.640958 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:20 crc kubenswrapper[4689]: I0307 04:26:20.950698 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xc7w"] Mar 07 04:26:20 crc kubenswrapper[4689]: W0307 04:26:20.976275 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ba6dceb_a52c_4108_af6e_ca861cdff2d9.slice/crio-379ac5a0f580751203ebad517787427668eeed946f9414a41b61148f561389ac WatchSource:0}: Error finding container 379ac5a0f580751203ebad517787427668eeed946f9414a41b61148f561389ac: Status 404 returned error can't find the container with id 379ac5a0f580751203ebad517787427668eeed946f9414a41b61148f561389ac Mar 07 04:26:21 crc kubenswrapper[4689]: I0307 04:26:21.048142 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6vbv"] Mar 07 04:26:21 crc kubenswrapper[4689]: W0307 04:26:21.062551 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf58e77c1_4fe5_4b43_bd3c_babc094119f0.slice/crio-c73cf2bc654095b247fb77100c6395b6790f3838d2c35113d4ab5b85d6b77f92 WatchSource:0}: Error finding container c73cf2bc654095b247fb77100c6395b6790f3838d2c35113d4ab5b85d6b77f92: Status 404 returned error can't find the container with id c73cf2bc654095b247fb77100c6395b6790f3838d2c35113d4ab5b85d6b77f92 Mar 07 04:26:21 crc kubenswrapper[4689]: I0307 04:26:21.180216 4689 generic.go:334] "Generic (PLEG): container finished" podID="769da50c-a6db-491d-90d7-146ac186dad8" containerID="11398724f2527c68de8f78f5cec357a0a129406748d83ad6701781dc98467734" exitCode=0 Mar 07 04:26:21 crc kubenswrapper[4689]: I0307 04:26:21.180308 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cclj" 
event={"ID":"769da50c-a6db-491d-90d7-146ac186dad8","Type":"ContainerDied","Data":"11398724f2527c68de8f78f5cec357a0a129406748d83ad6701781dc98467734"} Mar 07 04:26:21 crc kubenswrapper[4689]: I0307 04:26:21.182051 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6vbv" event={"ID":"f58e77c1-4fe5-4b43-bd3c-babc094119f0","Type":"ContainerStarted","Data":"c73cf2bc654095b247fb77100c6395b6790f3838d2c35113d4ab5b85d6b77f92"} Mar 07 04:26:21 crc kubenswrapper[4689]: I0307 04:26:21.192149 4689 generic.go:334] "Generic (PLEG): container finished" podID="4a76727a-27ba-4d05-92cf-01ec595c6989" containerID="743cc435ca800dfaa9fdb919a39df9fc8a7a50e1c94ec94a2d6b5e55df77d45d" exitCode=0 Mar 07 04:26:21 crc kubenswrapper[4689]: I0307 04:26:21.192242 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qkv8" event={"ID":"4a76727a-27ba-4d05-92cf-01ec595c6989","Type":"ContainerDied","Data":"743cc435ca800dfaa9fdb919a39df9fc8a7a50e1c94ec94a2d6b5e55df77d45d"} Mar 07 04:26:21 crc kubenswrapper[4689]: I0307 04:26:21.199275 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xc7w" event={"ID":"8ba6dceb-a52c-4108-af6e-ca861cdff2d9","Type":"ContainerStarted","Data":"8840c22de54201f638395c24e332792421df9717c8f7872108127bdcd0a7c803"} Mar 07 04:26:21 crc kubenswrapper[4689]: I0307 04:26:21.199316 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xc7w" event={"ID":"8ba6dceb-a52c-4108-af6e-ca861cdff2d9","Type":"ContainerStarted","Data":"379ac5a0f580751203ebad517787427668eeed946f9414a41b61148f561389ac"} Mar 07 04:26:22 crc kubenswrapper[4689]: I0307 04:26:22.206800 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cclj" 
event={"ID":"769da50c-a6db-491d-90d7-146ac186dad8","Type":"ContainerStarted","Data":"59009ca6ba2c4e771a7e1af85a7e9aaf5cc52bb3e73467b082bb2554c7e287d0"} Mar 07 04:26:22 crc kubenswrapper[4689]: I0307 04:26:22.208735 4689 generic.go:334] "Generic (PLEG): container finished" podID="f58e77c1-4fe5-4b43-bd3c-babc094119f0" containerID="79c5a78cc15fb993720dc83eca4cd12f3688f1972dcd98f8bc454af02ba673d3" exitCode=0 Mar 07 04:26:22 crc kubenswrapper[4689]: I0307 04:26:22.208860 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6vbv" event={"ID":"f58e77c1-4fe5-4b43-bd3c-babc094119f0","Type":"ContainerDied","Data":"79c5a78cc15fb993720dc83eca4cd12f3688f1972dcd98f8bc454af02ba673d3"} Mar 07 04:26:22 crc kubenswrapper[4689]: I0307 04:26:22.212454 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qkv8" event={"ID":"4a76727a-27ba-4d05-92cf-01ec595c6989","Type":"ContainerStarted","Data":"c8ecee0a5cf2c44835483b50d2fb78d99bd7e97814778288fdca321eb1ada45d"} Mar 07 04:26:22 crc kubenswrapper[4689]: I0307 04:26:22.214351 4689 generic.go:334] "Generic (PLEG): container finished" podID="8ba6dceb-a52c-4108-af6e-ca861cdff2d9" containerID="8840c22de54201f638395c24e332792421df9717c8f7872108127bdcd0a7c803" exitCode=0 Mar 07 04:26:22 crc kubenswrapper[4689]: I0307 04:26:22.214397 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xc7w" event={"ID":"8ba6dceb-a52c-4108-af6e-ca861cdff2d9","Type":"ContainerDied","Data":"8840c22de54201f638395c24e332792421df9717c8f7872108127bdcd0a7c803"} Mar 07 04:26:22 crc kubenswrapper[4689]: I0307 04:26:22.214425 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xc7w" event={"ID":"8ba6dceb-a52c-4108-af6e-ca861cdff2d9","Type":"ContainerStarted","Data":"6283682a8844d53ff630134f82ce10f9578f4d9a64557fdcd940e855b2838c50"} Mar 07 04:26:22 crc kubenswrapper[4689]: I0307 
04:26:22.238304 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8cclj" podStartSLOduration=2.784778754 podStartE2EDuration="5.238279086s" podCreationTimestamp="2026-03-07 04:26:17 +0000 UTC" firstStartedPulling="2026-03-07 04:26:19.167640453 +0000 UTC m=+424.214023982" lastFinishedPulling="2026-03-07 04:26:21.621140795 +0000 UTC m=+426.667524314" observedRunningTime="2026-03-07 04:26:22.232719685 +0000 UTC m=+427.279103214" watchObservedRunningTime="2026-03-07 04:26:22.238279086 +0000 UTC m=+427.284662615" Mar 07 04:26:22 crc kubenswrapper[4689]: I0307 04:26:22.257899 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4qkv8" podStartSLOduration=2.849749038 podStartE2EDuration="5.257873908s" podCreationTimestamp="2026-03-07 04:26:17 +0000 UTC" firstStartedPulling="2026-03-07 04:26:19.159876341 +0000 UTC m=+424.206259860" lastFinishedPulling="2026-03-07 04:26:21.568001201 +0000 UTC m=+426.614384730" observedRunningTime="2026-03-07 04:26:22.251913206 +0000 UTC m=+427.298296725" watchObservedRunningTime="2026-03-07 04:26:22.257873908 +0000 UTC m=+427.304257437" Mar 07 04:26:23 crc kubenswrapper[4689]: I0307 04:26:23.225612 4689 generic.go:334] "Generic (PLEG): container finished" podID="f58e77c1-4fe5-4b43-bd3c-babc094119f0" containerID="b797d4a66b2f260698476bd5d8d857fa0d7f3353175dfab193b45ac9304dc0b7" exitCode=0 Mar 07 04:26:23 crc kubenswrapper[4689]: I0307 04:26:23.225702 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6vbv" event={"ID":"f58e77c1-4fe5-4b43-bd3c-babc094119f0","Type":"ContainerDied","Data":"b797d4a66b2f260698476bd5d8d857fa0d7f3353175dfab193b45ac9304dc0b7"} Mar 07 04:26:23 crc kubenswrapper[4689]: I0307 04:26:23.227672 4689 generic.go:334] "Generic (PLEG): container finished" podID="8ba6dceb-a52c-4108-af6e-ca861cdff2d9" 
containerID="6283682a8844d53ff630134f82ce10f9578f4d9a64557fdcd940e855b2838c50" exitCode=0 Mar 07 04:26:23 crc kubenswrapper[4689]: I0307 04:26:23.227796 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xc7w" event={"ID":"8ba6dceb-a52c-4108-af6e-ca861cdff2d9","Type":"ContainerDied","Data":"6283682a8844d53ff630134f82ce10f9578f4d9a64557fdcd940e855b2838c50"} Mar 07 04:26:24 crc kubenswrapper[4689]: I0307 04:26:24.235651 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6vbv" event={"ID":"f58e77c1-4fe5-4b43-bd3c-babc094119f0","Type":"ContainerStarted","Data":"1342491380ce2af0fe41a3ecc185f4c48b4b54b8a028e7960e2cad77c8b0c5bc"} Mar 07 04:26:24 crc kubenswrapper[4689]: I0307 04:26:24.237924 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xc7w" event={"ID":"8ba6dceb-a52c-4108-af6e-ca861cdff2d9","Type":"ContainerStarted","Data":"0d81f8e1c420834122c516d855d90d245ef0aa582adbe131dcd74ec978444d73"} Mar 07 04:26:24 crc kubenswrapper[4689]: I0307 04:26:24.278931 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g6vbv" podStartSLOduration=2.811780279 podStartE2EDuration="4.278907548s" podCreationTimestamp="2026-03-07 04:26:20 +0000 UTC" firstStartedPulling="2026-03-07 04:26:22.210560372 +0000 UTC m=+427.256943861" lastFinishedPulling="2026-03-07 04:26:23.677687611 +0000 UTC m=+428.724071130" observedRunningTime="2026-03-07 04:26:24.254751922 +0000 UTC m=+429.301135461" watchObservedRunningTime="2026-03-07 04:26:24.278907548 +0000 UTC m=+429.325291047" Mar 07 04:26:28 crc kubenswrapper[4689]: I0307 04:26:28.033372 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:28 crc kubenswrapper[4689]: I0307 04:26:28.033573 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:28 crc kubenswrapper[4689]: I0307 04:26:28.238889 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:28 crc kubenswrapper[4689]: I0307 04:26:28.239288 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:28 crc kubenswrapper[4689]: I0307 04:26:28.300139 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:28 crc kubenswrapper[4689]: I0307 04:26:28.320306 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4xc7w" podStartSLOduration=5.711087676 podStartE2EDuration="8.320285771s" podCreationTimestamp="2026-03-07 04:26:20 +0000 UTC" firstStartedPulling="2026-03-07 04:26:21.201363057 +0000 UTC m=+426.247746556" lastFinishedPulling="2026-03-07 04:26:23.810561152 +0000 UTC m=+428.856944651" observedRunningTime="2026-03-07 04:26:24.276327588 +0000 UTC m=+429.322711107" watchObservedRunningTime="2026-03-07 04:26:28.320285771 +0000 UTC m=+433.366669280" Mar 07 04:26:28 crc kubenswrapper[4689]: I0307 04:26:28.356970 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4qkv8" Mar 07 04:26:29 crc kubenswrapper[4689]: I0307 04:26:29.109919 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8cclj" podUID="769da50c-a6db-491d-90d7-146ac186dad8" containerName="registry-server" probeResult="failure" output=< Mar 07 04:26:29 crc kubenswrapper[4689]: timeout: failed to connect service ":50051" within 1s Mar 07 04:26:29 crc kubenswrapper[4689]: > Mar 07 04:26:29 crc kubenswrapper[4689]: I0307 04:26:29.190029 4689 patch_prober.go:28] interesting 
pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:26:29 crc kubenswrapper[4689]: I0307 04:26:29.190154 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:26:30 crc kubenswrapper[4689]: I0307 04:26:30.462475 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:30 crc kubenswrapper[4689]: I0307 04:26:30.462543 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:30 crc kubenswrapper[4689]: I0307 04:26:30.541144 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:30 crc kubenswrapper[4689]: I0307 04:26:30.642127 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:30 crc kubenswrapper[4689]: I0307 04:26:30.642201 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:30 crc kubenswrapper[4689]: I0307 04:26:30.700302 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:31 crc kubenswrapper[4689]: I0307 04:26:31.352685 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4xc7w" Mar 07 04:26:31 crc 
kubenswrapper[4689]: I0307 04:26:31.360934 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g6vbv" Mar 07 04:26:38 crc kubenswrapper[4689]: I0307 04:26:38.101792 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:38 crc kubenswrapper[4689]: I0307 04:26:38.175039 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8cclj" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.478583 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-td7nq"] Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.480299 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.503134 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-td7nq"] Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.622235 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q97vl\" (UniqueName: \"kubernetes.io/projected/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-kube-api-access-q97vl\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.622298 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" 
Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.622331 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.622449 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.622502 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-registry-tls\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.622520 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-registry-certificates\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.622559 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-trusted-ca\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.622575 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-bound-sa-token\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.660496 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.723160 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.723242 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-registry-tls\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.723263 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-registry-certificates\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.723301 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-trusted-ca\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.723321 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-bound-sa-token\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.723342 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q97vl\" (UniqueName: \"kubernetes.io/projected/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-kube-api-access-q97vl\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.723360 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.723707 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.724919 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-trusted-ca\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.725509 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-registry-certificates\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.732879 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-registry-tls\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.734672 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-td7nq\" (UID: 
\"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.740071 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-bound-sa-token\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.743183 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q97vl\" (UniqueName: \"kubernetes.io/projected/a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc-kube-api-access-q97vl\") pod \"image-registry-66df7c8f76-td7nq\" (UID: \"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:43 crc kubenswrapper[4689]: I0307 04:26:43.799147 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:44 crc kubenswrapper[4689]: I0307 04:26:44.255690 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-td7nq"] Mar 07 04:26:44 crc kubenswrapper[4689]: W0307 04:26:44.267438 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f0ee56_9922_4e4b_8a80_56a5c6c5a0bc.slice/crio-11c9864b836e5a99ca54f863915a2f27d7312c92356d6d7d872f9a09f80f9bb3 WatchSource:0}: Error finding container 11c9864b836e5a99ca54f863915a2f27d7312c92356d6d7d872f9a09f80f9bb3: Status 404 returned error can't find the container with id 11c9864b836e5a99ca54f863915a2f27d7312c92356d6d7d872f9a09f80f9bb3 Mar 07 04:26:44 crc kubenswrapper[4689]: I0307 04:26:44.371703 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" event={"ID":"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc","Type":"ContainerStarted","Data":"11c9864b836e5a99ca54f863915a2f27d7312c92356d6d7d872f9a09f80f9bb3"} Mar 07 04:26:45 crc kubenswrapper[4689]: I0307 04:26:45.381604 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" event={"ID":"a7f0ee56-9922-4e4b-8a80-56a5c6c5a0bc","Type":"ContainerStarted","Data":"7085417dc88ab2e76f1d114b8fea0f2e55eb991c335cba67468c6085c22e99e7"} Mar 07 04:26:45 crc kubenswrapper[4689]: I0307 04:26:45.381972 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:26:45 crc kubenswrapper[4689]: I0307 04:26:45.418601 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" podStartSLOduration=2.418580199 podStartE2EDuration="2.418580199s" podCreationTimestamp="2026-03-07 04:26:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:26:45.415929667 +0000 UTC m=+450.462313186" watchObservedRunningTime="2026-03-07 04:26:45.418580199 +0000 UTC m=+450.464963708" Mar 07 04:26:59 crc kubenswrapper[4689]: I0307 04:26:59.190285 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:26:59 crc kubenswrapper[4689]: I0307 04:26:59.191129 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:27:03 crc kubenswrapper[4689]: I0307 04:27:03.807105 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-td7nq" Mar 07 04:27:03 crc kubenswrapper[4689]: I0307 04:27:03.886433 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4cbc9"] Mar 07 04:27:28 crc kubenswrapper[4689]: I0307 04:27:28.934702 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" podUID="60af193a-2553-4f45-b190-c86e1e3594e1" containerName="registry" containerID="cri-o://ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09" gracePeriod=30 Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.190745 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.190832 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.190894 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.191753 4689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23812cdb895a5f0e0a59b8a60c77194b6f8d32629f6b8cae7e8e7f3fc587e614"} pod="openshift-machine-config-operator/machine-config-daemon-dss5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.191881 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" containerID="cri-o://23812cdb895a5f0e0a59b8a60c77194b6f8d32629f6b8cae7e8e7f3fc587e614" gracePeriod=600 Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.369353 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.517626 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-bound-sa-token\") pod \"60af193a-2553-4f45-b190-c86e1e3594e1\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.517699 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60af193a-2553-4f45-b190-c86e1e3594e1-installation-pull-secrets\") pod \"60af193a-2553-4f45-b190-c86e1e3594e1\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.517772 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60af193a-2553-4f45-b190-c86e1e3594e1-ca-trust-extracted\") pod \"60af193a-2553-4f45-b190-c86e1e3594e1\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.517805 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9rs8\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-kube-api-access-h9rs8\") pod \"60af193a-2553-4f45-b190-c86e1e3594e1\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.517835 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-registry-tls\") pod \"60af193a-2553-4f45-b190-c86e1e3594e1\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.517860 4689 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-trusted-ca\") pod \"60af193a-2553-4f45-b190-c86e1e3594e1\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.518067 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"60af193a-2553-4f45-b190-c86e1e3594e1\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.518098 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-registry-certificates\") pod \"60af193a-2553-4f45-b190-c86e1e3594e1\" (UID: \"60af193a-2553-4f45-b190-c86e1e3594e1\") " Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.519653 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "60af193a-2553-4f45-b190-c86e1e3594e1" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.520284 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "60af193a-2553-4f45-b190-c86e1e3594e1" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.525870 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "60af193a-2553-4f45-b190-c86e1e3594e1" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.526565 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "60af193a-2553-4f45-b190-c86e1e3594e1" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.526919 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-kube-api-access-h9rs8" (OuterVolumeSpecName: "kube-api-access-h9rs8") pod "60af193a-2553-4f45-b190-c86e1e3594e1" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1"). InnerVolumeSpecName "kube-api-access-h9rs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.527134 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60af193a-2553-4f45-b190-c86e1e3594e1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "60af193a-2553-4f45-b190-c86e1e3594e1" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.535093 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "60af193a-2553-4f45-b190-c86e1e3594e1" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.542251 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60af193a-2553-4f45-b190-c86e1e3594e1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "60af193a-2553-4f45-b190-c86e1e3594e1" (UID: "60af193a-2553-4f45-b190-c86e1e3594e1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.619752 4689 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60af193a-2553-4f45-b190-c86e1e3594e1-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.619789 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9rs8\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-kube-api-access-h9rs8\") on node \"crc\" DevicePath \"\"" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.619803 4689 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.619814 4689 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.619825 4689 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60af193a-2553-4f45-b190-c86e1e3594e1-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.619835 4689 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60af193a-2553-4f45-b190-c86e1e3594e1-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.619847 4689 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60af193a-2553-4f45-b190-c86e1e3594e1-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.688312 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerID="23812cdb895a5f0e0a59b8a60c77194b6f8d32629f6b8cae7e8e7f3fc587e614" exitCode=0 Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.688392 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerDied","Data":"23812cdb895a5f0e0a59b8a60c77194b6f8d32629f6b8cae7e8e7f3fc587e614"} Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.688454 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"b929a5d6764e60d2412d02d2d30426f108f4c1d195b3fd95c7435c02b959921b"} Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.688487 4689 scope.go:117] "RemoveContainer" 
containerID="75b084cd80a9cd340a1396e0937ec9c618e016e1383617f4dec2792051477d83" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.691733 4689 generic.go:334] "Generic (PLEG): container finished" podID="60af193a-2553-4f45-b190-c86e1e3594e1" containerID="ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09" exitCode=0 Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.691788 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" event={"ID":"60af193a-2553-4f45-b190-c86e1e3594e1","Type":"ContainerDied","Data":"ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09"} Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.691823 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" event={"ID":"60af193a-2553-4f45-b190-c86e1e3594e1","Type":"ContainerDied","Data":"2fae2282f70c1c463259dbfb54aa4e1f56dfacda6c7a9bd04e1536eb48388c16"} Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.692597 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4cbc9" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.733901 4689 scope.go:117] "RemoveContainer" containerID="ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.748392 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4cbc9"] Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.752312 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4cbc9"] Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.767284 4689 scope.go:117] "RemoveContainer" containerID="ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09" Mar 07 04:27:29 crc kubenswrapper[4689]: E0307 04:27:29.767804 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09\": container with ID starting with ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09 not found: ID does not exist" containerID="ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.767862 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09"} err="failed to get container status \"ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09\": rpc error: code = NotFound desc = could not find container \"ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09\": container with ID starting with ca205ab97df67a87ef25563ed50152470058299d0973d806bd000ab486bf0b09 not found: ID does not exist" Mar 07 04:27:29 crc kubenswrapper[4689]: I0307 04:27:29.842366 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="60af193a-2553-4f45-b190-c86e1e3594e1" path="/var/lib/kubelet/pods/60af193a-2553-4f45-b190-c86e1e3594e1/volumes" Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.144329 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547628-zf96k"] Mar 07 04:28:00 crc kubenswrapper[4689]: E0307 04:28:00.145137 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60af193a-2553-4f45-b190-c86e1e3594e1" containerName="registry" Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.145154 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="60af193a-2553-4f45-b190-c86e1e3594e1" containerName="registry" Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.145319 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="60af193a-2553-4f45-b190-c86e1e3594e1" containerName="registry" Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.145903 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547628-zf96k" Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.148842 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.149407 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.150900 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.164775 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547628-zf96k"] Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.281201 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d4tv\" (UniqueName: 
\"kubernetes.io/projected/3c62f147-723a-420a-b75b-efe6a8585eb9-kube-api-access-4d4tv\") pod \"auto-csr-approver-29547628-zf96k\" (UID: \"3c62f147-723a-420a-b75b-efe6a8585eb9\") " pod="openshift-infra/auto-csr-approver-29547628-zf96k" Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.382790 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d4tv\" (UniqueName: \"kubernetes.io/projected/3c62f147-723a-420a-b75b-efe6a8585eb9-kube-api-access-4d4tv\") pod \"auto-csr-approver-29547628-zf96k\" (UID: \"3c62f147-723a-420a-b75b-efe6a8585eb9\") " pod="openshift-infra/auto-csr-approver-29547628-zf96k" Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.416420 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d4tv\" (UniqueName: \"kubernetes.io/projected/3c62f147-723a-420a-b75b-efe6a8585eb9-kube-api-access-4d4tv\") pod \"auto-csr-approver-29547628-zf96k\" (UID: \"3c62f147-723a-420a-b75b-efe6a8585eb9\") " pod="openshift-infra/auto-csr-approver-29547628-zf96k" Mar 07 04:28:00 crc kubenswrapper[4689]: I0307 04:28:00.494566 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547628-zf96k" Mar 07 04:28:01 crc kubenswrapper[4689]: I0307 04:28:01.000371 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547628-zf96k"] Mar 07 04:28:01 crc kubenswrapper[4689]: I0307 04:28:01.907611 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547628-zf96k" event={"ID":"3c62f147-723a-420a-b75b-efe6a8585eb9","Type":"ContainerStarted","Data":"f7e0c5186d6ac2ec610496c5177bcfd0f0d6451c48fc4cdbd540b3ad0e3a2306"} Mar 07 04:28:02 crc kubenswrapper[4689]: I0307 04:28:02.917851 4689 generic.go:334] "Generic (PLEG): container finished" podID="3c62f147-723a-420a-b75b-efe6a8585eb9" containerID="4b82357dc93d44463e296016d0781a5e7e0aa4b1e2f16303973938415ee63403" exitCode=0 Mar 07 04:28:02 crc kubenswrapper[4689]: I0307 04:28:02.917987 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547628-zf96k" event={"ID":"3c62f147-723a-420a-b75b-efe6a8585eb9","Type":"ContainerDied","Data":"4b82357dc93d44463e296016d0781a5e7e0aa4b1e2f16303973938415ee63403"} Mar 07 04:28:04 crc kubenswrapper[4689]: I0307 04:28:04.187720 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547628-zf96k" Mar 07 04:28:04 crc kubenswrapper[4689]: I0307 04:28:04.337783 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4tv\" (UniqueName: \"kubernetes.io/projected/3c62f147-723a-420a-b75b-efe6a8585eb9-kube-api-access-4d4tv\") pod \"3c62f147-723a-420a-b75b-efe6a8585eb9\" (UID: \"3c62f147-723a-420a-b75b-efe6a8585eb9\") " Mar 07 04:28:04 crc kubenswrapper[4689]: I0307 04:28:04.347423 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c62f147-723a-420a-b75b-efe6a8585eb9-kube-api-access-4d4tv" (OuterVolumeSpecName: "kube-api-access-4d4tv") pod "3c62f147-723a-420a-b75b-efe6a8585eb9" (UID: "3c62f147-723a-420a-b75b-efe6a8585eb9"). InnerVolumeSpecName "kube-api-access-4d4tv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:28:04 crc kubenswrapper[4689]: I0307 04:28:04.439867 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4tv\" (UniqueName: \"kubernetes.io/projected/3c62f147-723a-420a-b75b-efe6a8585eb9-kube-api-access-4d4tv\") on node \"crc\" DevicePath \"\"" Mar 07 04:28:04 crc kubenswrapper[4689]: I0307 04:28:04.935127 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547628-zf96k" event={"ID":"3c62f147-723a-420a-b75b-efe6a8585eb9","Type":"ContainerDied","Data":"f7e0c5186d6ac2ec610496c5177bcfd0f0d6451c48fc4cdbd540b3ad0e3a2306"} Mar 07 04:28:04 crc kubenswrapper[4689]: I0307 04:28:04.935267 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7e0c5186d6ac2ec610496c5177bcfd0f0d6451c48fc4cdbd540b3ad0e3a2306" Mar 07 04:28:04 crc kubenswrapper[4689]: I0307 04:28:04.935263 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547628-zf96k" Mar 07 04:28:05 crc kubenswrapper[4689]: I0307 04:28:05.267006 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547622-4796h"] Mar 07 04:28:05 crc kubenswrapper[4689]: I0307 04:28:05.276769 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547622-4796h"] Mar 07 04:28:05 crc kubenswrapper[4689]: I0307 04:28:05.837929 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33a94bd2-f479-403b-9c36-a708410864aa" path="/var/lib/kubelet/pods/33a94bd2-f479-403b-9c36-a708410864aa/volumes" Mar 07 04:29:29 crc kubenswrapper[4689]: I0307 04:29:29.190399 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:29:29 crc kubenswrapper[4689]: I0307 04:29:29.192433 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:29:59 crc kubenswrapper[4689]: I0307 04:29:59.189814 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:29:59 crc kubenswrapper[4689]: I0307 04:29:59.190468 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" 
podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.156758 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547630-wl6lv"] Mar 07 04:30:00 crc kubenswrapper[4689]: E0307 04:30:00.157635 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c62f147-723a-420a-b75b-efe6a8585eb9" containerName="oc" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.157681 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c62f147-723a-420a-b75b-efe6a8585eb9" containerName="oc" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.157905 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c62f147-723a-420a-b75b-efe6a8585eb9" containerName="oc" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.158742 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547630-wl6lv" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.161625 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.162110 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.162539 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.168295 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw"] Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.169693 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.172647 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.173339 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.180535 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547630-wl6lv"] Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.198006 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw"] Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.229868 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9rgq\" (UniqueName: \"kubernetes.io/projected/2dc541d1-e031-4e42-a304-66a08cb905b1-kube-api-access-v9rgq\") pod \"auto-csr-approver-29547630-wl6lv\" (UID: \"2dc541d1-e031-4e42-a304-66a08cb905b1\") " pod="openshift-infra/auto-csr-approver-29547630-wl6lv" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.331146 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2c7ab85-90e5-4569-b3e7-8150ed422271-config-volume\") pod \"collect-profiles-29547630-jpqlw\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.331332 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9rgq\" (UniqueName: 
\"kubernetes.io/projected/2dc541d1-e031-4e42-a304-66a08cb905b1-kube-api-access-v9rgq\") pod \"auto-csr-approver-29547630-wl6lv\" (UID: \"2dc541d1-e031-4e42-a304-66a08cb905b1\") " pod="openshift-infra/auto-csr-approver-29547630-wl6lv" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.331396 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr7lx\" (UniqueName: \"kubernetes.io/projected/b2c7ab85-90e5-4569-b3e7-8150ed422271-kube-api-access-dr7lx\") pod \"collect-profiles-29547630-jpqlw\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.331453 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2c7ab85-90e5-4569-b3e7-8150ed422271-secret-volume\") pod \"collect-profiles-29547630-jpqlw\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.364852 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9rgq\" (UniqueName: \"kubernetes.io/projected/2dc541d1-e031-4e42-a304-66a08cb905b1-kube-api-access-v9rgq\") pod \"auto-csr-approver-29547630-wl6lv\" (UID: \"2dc541d1-e031-4e42-a304-66a08cb905b1\") " pod="openshift-infra/auto-csr-approver-29547630-wl6lv" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.432587 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2c7ab85-90e5-4569-b3e7-8150ed422271-config-volume\") pod \"collect-profiles-29547630-jpqlw\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:00 crc kubenswrapper[4689]: 
I0307 04:30:00.432719 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr7lx\" (UniqueName: \"kubernetes.io/projected/b2c7ab85-90e5-4569-b3e7-8150ed422271-kube-api-access-dr7lx\") pod \"collect-profiles-29547630-jpqlw\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.432781 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2c7ab85-90e5-4569-b3e7-8150ed422271-secret-volume\") pod \"collect-profiles-29547630-jpqlw\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.434276 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2c7ab85-90e5-4569-b3e7-8150ed422271-config-volume\") pod \"collect-profiles-29547630-jpqlw\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.437881 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2c7ab85-90e5-4569-b3e7-8150ed422271-secret-volume\") pod \"collect-profiles-29547630-jpqlw\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.465773 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr7lx\" (UniqueName: \"kubernetes.io/projected/b2c7ab85-90e5-4569-b3e7-8150ed422271-kube-api-access-dr7lx\") pod \"collect-profiles-29547630-jpqlw\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.492827 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547630-wl6lv" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.505309 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.729713 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw"] Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.778242 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547630-wl6lv"] Mar 07 04:30:00 crc kubenswrapper[4689]: W0307 04:30:00.795259 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dc541d1_e031_4e42_a304_66a08cb905b1.slice/crio-6e14c09e25fb1210a40788dcaa1600e1240a72c91efb2a9e044b72fbff4d9241 WatchSource:0}: Error finding container 6e14c09e25fb1210a40788dcaa1600e1240a72c91efb2a9e044b72fbff4d9241: Status 404 returned error can't find the container with id 6e14c09e25fb1210a40788dcaa1600e1240a72c91efb2a9e044b72fbff4d9241 Mar 07 04:30:00 crc kubenswrapper[4689]: I0307 04:30:00.797421 4689 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 04:30:01 crc kubenswrapper[4689]: I0307 04:30:01.712725 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547630-wl6lv" event={"ID":"2dc541d1-e031-4e42-a304-66a08cb905b1","Type":"ContainerStarted","Data":"6e14c09e25fb1210a40788dcaa1600e1240a72c91efb2a9e044b72fbff4d9241"} Mar 07 04:30:01 crc kubenswrapper[4689]: I0307 04:30:01.715544 4689 generic.go:334] "Generic (PLEG): container 
finished" podID="b2c7ab85-90e5-4569-b3e7-8150ed422271" containerID="c7ccf951dbafedd03ea5020b49343eb990ec06f217e4a8dffb307d9e2f0af036" exitCode=0 Mar 07 04:30:01 crc kubenswrapper[4689]: I0307 04:30:01.715600 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" event={"ID":"b2c7ab85-90e5-4569-b3e7-8150ed422271","Type":"ContainerDied","Data":"c7ccf951dbafedd03ea5020b49343eb990ec06f217e4a8dffb307d9e2f0af036"} Mar 07 04:30:01 crc kubenswrapper[4689]: I0307 04:30:01.715649 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" event={"ID":"b2c7ab85-90e5-4569-b3e7-8150ed422271","Type":"ContainerStarted","Data":"fd3cdc46682a50dfa722545c558d3e5091969ac52351d7e5bc5ec5193f07f851"} Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.087671 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.169445 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2c7ab85-90e5-4569-b3e7-8150ed422271-secret-volume\") pod \"b2c7ab85-90e5-4569-b3e7-8150ed422271\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.169591 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr7lx\" (UniqueName: \"kubernetes.io/projected/b2c7ab85-90e5-4569-b3e7-8150ed422271-kube-api-access-dr7lx\") pod \"b2c7ab85-90e5-4569-b3e7-8150ed422271\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.169756 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b2c7ab85-90e5-4569-b3e7-8150ed422271-config-volume\") pod \"b2c7ab85-90e5-4569-b3e7-8150ed422271\" (UID: \"b2c7ab85-90e5-4569-b3e7-8150ed422271\") " Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.170442 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2c7ab85-90e5-4569-b3e7-8150ed422271-config-volume" (OuterVolumeSpecName: "config-volume") pod "b2c7ab85-90e5-4569-b3e7-8150ed422271" (UID: "b2c7ab85-90e5-4569-b3e7-8150ed422271"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.175483 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c7ab85-90e5-4569-b3e7-8150ed422271-kube-api-access-dr7lx" (OuterVolumeSpecName: "kube-api-access-dr7lx") pod "b2c7ab85-90e5-4569-b3e7-8150ed422271" (UID: "b2c7ab85-90e5-4569-b3e7-8150ed422271"). InnerVolumeSpecName "kube-api-access-dr7lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.175619 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2c7ab85-90e5-4569-b3e7-8150ed422271-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b2c7ab85-90e5-4569-b3e7-8150ed422271" (UID: "b2c7ab85-90e5-4569-b3e7-8150ed422271"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.272118 4689 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2c7ab85-90e5-4569-b3e7-8150ed422271-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.272197 4689 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2c7ab85-90e5-4569-b3e7-8150ed422271-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.272218 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr7lx\" (UniqueName: \"kubernetes.io/projected/b2c7ab85-90e5-4569-b3e7-8150ed422271-kube-api-access-dr7lx\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.735097 4689 generic.go:334] "Generic (PLEG): container finished" podID="2dc541d1-e031-4e42-a304-66a08cb905b1" containerID="7c0b064a3b0f3ef5d50efe29dd1b27c172544ea401153f3747007d9d139a8a3c" exitCode=0 Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.735242 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547630-wl6lv" event={"ID":"2dc541d1-e031-4e42-a304-66a08cb905b1","Type":"ContainerDied","Data":"7c0b064a3b0f3ef5d50efe29dd1b27c172544ea401153f3747007d9d139a8a3c"} Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.739710 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" event={"ID":"b2c7ab85-90e5-4569-b3e7-8150ed422271","Type":"ContainerDied","Data":"fd3cdc46682a50dfa722545c558d3e5091969ac52351d7e5bc5ec5193f07f851"} Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.739958 4689 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="fd3cdc46682a50dfa722545c558d3e5091969ac52351d7e5bc5ec5193f07f851" Mar 07 04:30:03 crc kubenswrapper[4689]: I0307 04:30:03.739794 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547630-jpqlw" Mar 07 04:30:04 crc kubenswrapper[4689]: E0307 04:30:04.945255 4689 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Mar 07 04:30:05 crc kubenswrapper[4689]: I0307 04:30:05.098911 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547630-wl6lv" Mar 07 04:30:05 crc kubenswrapper[4689]: I0307 04:30:05.198192 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9rgq\" (UniqueName: \"kubernetes.io/projected/2dc541d1-e031-4e42-a304-66a08cb905b1-kube-api-access-v9rgq\") pod \"2dc541d1-e031-4e42-a304-66a08cb905b1\" (UID: \"2dc541d1-e031-4e42-a304-66a08cb905b1\") " Mar 07 04:30:05 crc kubenswrapper[4689]: I0307 04:30:05.205322 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc541d1-e031-4e42-a304-66a08cb905b1-kube-api-access-v9rgq" (OuterVolumeSpecName: "kube-api-access-v9rgq") pod "2dc541d1-e031-4e42-a304-66a08cb905b1" (UID: "2dc541d1-e031-4e42-a304-66a08cb905b1"). InnerVolumeSpecName "kube-api-access-v9rgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:30:05 crc kubenswrapper[4689]: I0307 04:30:05.300831 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9rgq\" (UniqueName: \"kubernetes.io/projected/2dc541d1-e031-4e42-a304-66a08cb905b1-kube-api-access-v9rgq\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:05 crc kubenswrapper[4689]: I0307 04:30:05.757915 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547630-wl6lv" event={"ID":"2dc541d1-e031-4e42-a304-66a08cb905b1","Type":"ContainerDied","Data":"6e14c09e25fb1210a40788dcaa1600e1240a72c91efb2a9e044b72fbff4d9241"} Mar 07 04:30:05 crc kubenswrapper[4689]: I0307 04:30:05.757959 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e14c09e25fb1210a40788dcaa1600e1240a72c91efb2a9e044b72fbff4d9241" Mar 07 04:30:05 crc kubenswrapper[4689]: I0307 04:30:05.758051 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547630-wl6lv" Mar 07 04:30:06 crc kubenswrapper[4689]: I0307 04:30:06.174922 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547624-v2j6p"] Mar 07 04:30:06 crc kubenswrapper[4689]: I0307 04:30:06.187805 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547624-v2j6p"] Mar 07 04:30:07 crc kubenswrapper[4689]: I0307 04:30:07.839858 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d98532b6-658d-41de-8e97-0f941ad34251" path="/var/lib/kubelet/pods/d98532b6-658d-41de-8e97-0f941ad34251/volumes" Mar 07 04:30:29 crc kubenswrapper[4689]: I0307 04:30:29.189942 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 07 04:30:29 crc kubenswrapper[4689]: I0307 04:30:29.190663 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:30:29 crc kubenswrapper[4689]: I0307 04:30:29.190715 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:30:29 crc kubenswrapper[4689]: I0307 04:30:29.191312 4689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b929a5d6764e60d2412d02d2d30426f108f4c1d195b3fd95c7435c02b959921b"} pod="openshift-machine-config-operator/machine-config-daemon-dss5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 04:30:29 crc kubenswrapper[4689]: I0307 04:30:29.191369 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" containerID="cri-o://b929a5d6764e60d2412d02d2d30426f108f4c1d195b3fd95c7435c02b959921b" gracePeriod=600 Mar 07 04:30:29 crc kubenswrapper[4689]: I0307 04:30:29.937003 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerID="b929a5d6764e60d2412d02d2d30426f108f4c1d195b3fd95c7435c02b959921b" exitCode=0 Mar 07 04:30:29 crc kubenswrapper[4689]: I0307 04:30:29.937055 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" 
event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerDied","Data":"b929a5d6764e60d2412d02d2d30426f108f4c1d195b3fd95c7435c02b959921b"} Mar 07 04:30:29 crc kubenswrapper[4689]: I0307 04:30:29.937747 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"9c811faf449bec22216350a82fb0e4edb8efb6f32a1e999aafd915dabcad4588"} Mar 07 04:30:29 crc kubenswrapper[4689]: I0307 04:30:29.937770 4689 scope.go:117] "RemoveContainer" containerID="23812cdb895a5f0e0a59b8a60c77194b6f8d32629f6b8cae7e8e7f3fc587e614" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.457824 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j9bx5"] Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.459738 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovn-controller" containerID="cri-o://c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da" gracePeriod=30 Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.459976 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="kube-rbac-proxy-node" containerID="cri-o://f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f" gracePeriod=30 Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.459946 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="sbdb" containerID="cri-o://ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d" gracePeriod=30 Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.460024 4689 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76" gracePeriod=30 Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.460095 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovn-acl-logging" containerID="cri-o://a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411" gracePeriod=30 Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.460307 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="northd" containerID="cri-o://e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606" gracePeriod=30 Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.460322 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="nbdb" containerID="cri-o://5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358" gracePeriod=30 Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.531351 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" containerID="cri-o://a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196" gracePeriod=30 Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.821076 4689 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/3.log" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.824436 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovn-acl-logging/0.log" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.824999 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovn-controller/0.log" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.825802 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.902341 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6dqt8"] Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.903141 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="sbdb" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.903224 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="sbdb" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.903287 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.903337 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.903386 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.903430 
4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.903476 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="kube-rbac-proxy-ovn-metrics" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.903524 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="kube-rbac-proxy-ovn-metrics" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.903579 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="northd" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.903625 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="northd" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.903671 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc541d1-e031-4e42-a304-66a08cb905b1" containerName="oc" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.904465 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc541d1-e031-4e42-a304-66a08cb905b1" containerName="oc" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.904534 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovn-acl-logging" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.904584 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovn-acl-logging" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.904634 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="nbdb" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.904685 4689 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="nbdb" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.904735 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovn-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.904782 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovn-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.904833 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="kubecfg-setup" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.904880 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="kubecfg-setup" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.904924 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.904971 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.905020 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="kube-rbac-proxy-node" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.905065 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="kube-rbac-proxy-node" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.905113 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.905397 4689 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.905452 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c7ab85-90e5-4569-b3e7-8150ed422271" containerName="collect-profiles" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.905508 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c7ab85-90e5-4569-b3e7-8150ed422271" containerName="collect-profiles" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.905641 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.905692 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.905743 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="kube-rbac-proxy-ovn-metrics" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.905800 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.905848 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.905893 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="northd" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.905941 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovn-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.905988 4689 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc541d1-e031-4e42-a304-66a08cb905b1" containerName="oc" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.906036 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="sbdb" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.906087 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovn-acl-logging" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.906137 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="kube-rbac-proxy-node" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.906201 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="nbdb" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.906256 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c7ab85-90e5-4569-b3e7-8150ed422271" containerName="collect-profiles" Mar 07 04:30:37 crc kubenswrapper[4689]: E0307 04:30:37.906390 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.906442 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.906587 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerName="ovnkube-controller" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.908154 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984272 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-log-socket\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984334 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-openvswitch\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984381 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-env-overrides\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984428 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-systemd\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984421 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-log-socket" (OuterVolumeSpecName: "log-socket") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984473 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-var-lib-openvswitch\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984475 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984503 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-slash\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984547 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2hpp\" (UniqueName: \"kubernetes.io/projected/ee6653df-cf05-46a7-9187-97bfc3c5b849-kube-api-access-w2hpp\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984577 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-kubelet\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984611 4689 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-script-lib\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984640 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-etc-openvswitch\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984677 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovn-node-metrics-cert\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984720 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-bin\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984750 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-config\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984782 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984810 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-ovn\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984839 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-netd\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984865 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-node-log\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984949 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-ovn-kubernetes\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.984990 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-systemd-units\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc 
kubenswrapper[4689]: I0307 04:30:37.985020 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-netns\") pod \"ee6653df-cf05-46a7-9187-97bfc3c5b849\" (UID: \"ee6653df-cf05-46a7-9187-97bfc3c5b849\") " Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985049 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985095 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985127 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985197 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985590 4689 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985616 4689 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985633 4689 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-log-socket\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985650 4689 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985666 4689 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985682 4689 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985676 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985733 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985760 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985785 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-node-log" (OuterVolumeSpecName: "node-log") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985791 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985819 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.985843 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.986629 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-slash" (OuterVolumeSpecName: "host-slash") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.987865 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.987914 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.988676 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.995371 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:30:37 crc kubenswrapper[4689]: I0307 04:30:37.999045 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee6653df-cf05-46a7-9187-97bfc3c5b849-kube-api-access-w2hpp" (OuterVolumeSpecName: "kube-api-access-w2hpp") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "kube-api-access-w2hpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.004341 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovnkube-controller/3.log" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.007916 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovn-acl-logging/0.log" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.007931 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ee6653df-cf05-46a7-9187-97bfc3c5b849" (UID: "ee6653df-cf05-46a7-9187-97bfc3c5b849"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.010582 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j9bx5_ee6653df-cf05-46a7-9187-97bfc3c5b849/ovn-controller/0.log" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011069 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196" exitCode=0 Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011094 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d" exitCode=0 Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011102 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358" exitCode=0 Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011109 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606" exitCode=0 Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011117 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76" exitCode=0 Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011126 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f" exitCode=0 Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011133 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" 
containerID="a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411" exitCode=143 Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011140 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee6653df-cf05-46a7-9187-97bfc3c5b849" containerID="c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da" exitCode=143 Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011135 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011213 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011226 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011236 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011245 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011277 
4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011290 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011300 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011256 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011307 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011426 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011438 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011445 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 
04:30:38.011450 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011456 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011461 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011477 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011273 4689 scope.go:117] "RemoveContainer" containerID="a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011495 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011567 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011572 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 
04:30:38.011577 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011583 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011587 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011592 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011597 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011602 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011609 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011617 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" 
event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011625 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011632 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011637 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011643 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011649 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011655 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011661 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011667 4689 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011673 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011679 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011687 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j9bx5" event={"ID":"ee6653df-cf05-46a7-9187-97bfc3c5b849","Type":"ContainerDied","Data":"3408db5fde5a5ed80ae1d3f2519603aa7e40a80a9d55203bf2a14ff02fcb4159"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011695 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011702 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011708 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011713 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358"} Mar 07 04:30:38 crc kubenswrapper[4689]: 
I0307 04:30:38.011719 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011724 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011729 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011734 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011739 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.011744 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.013391 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wmhqx_5508b217-e634-41a8-813a-65ae39d7ea3d/kube-multus/2.log" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.013895 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wmhqx_5508b217-e634-41a8-813a-65ae39d7ea3d/kube-multus/1.log" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.013929 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="5508b217-e634-41a8-813a-65ae39d7ea3d" containerID="893297981dd6ce3d3fbe960d1e5b7c6adc5bb2f18dcfd916b37cf25761cff3d9" exitCode=2 Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.013970 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wmhqx" event={"ID":"5508b217-e634-41a8-813a-65ae39d7ea3d","Type":"ContainerDied","Data":"893297981dd6ce3d3fbe960d1e5b7c6adc5bb2f18dcfd916b37cf25761cff3d9"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.013984 4689 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5"} Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.014664 4689 scope.go:117] "RemoveContainer" containerID="893297981dd6ce3d3fbe960d1e5b7c6adc5bb2f18dcfd916b37cf25761cff3d9" Mar 07 04:30:38 crc kubenswrapper[4689]: E0307 04:30:38.014880 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wmhqx_openshift-multus(5508b217-e634-41a8-813a-65ae39d7ea3d)\"" pod="openshift-multus/multus-wmhqx" podUID="5508b217-e634-41a8-813a-65ae39d7ea3d" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.046078 4689 scope.go:117] "RemoveContainer" containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.063582 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j9bx5"] Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.068711 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j9bx5"] Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.083063 4689 scope.go:117] "RemoveContainer" containerID="ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d" Mar 07 04:30:38 crc 
kubenswrapper[4689]: I0307 04:30:38.087086 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-kubelet\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087120 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-run-ovn-kubernetes\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087138 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-run-openvswitch\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087153 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-log-socket\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087325 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-cni-netd\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087348 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92ac7a46-152d-4727-8ba3-1f4d0cca9290-env-overrides\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087370 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-systemd-units\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087384 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-run-systemd\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087399 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92ac7a46-152d-4727-8ba3-1f4d0cca9290-ovn-node-metrics-cert\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087421 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-run-ovn\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087436 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-cni-bin\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087455 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-slash\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087472 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-etc-openvswitch\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087485 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd6sv\" (UniqueName: \"kubernetes.io/projected/92ac7a46-152d-4727-8ba3-1f4d0cca9290-kube-api-access-kd6sv\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087508 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/92ac7a46-152d-4727-8ba3-1f4d0cca9290-ovnkube-script-lib\") pod \"ovnkube-node-6dqt8\" (UID: 
\"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087523 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087545 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-run-netns\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087558 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-node-log\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087572 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-var-lib-openvswitch\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087586 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/92ac7a46-152d-4727-8ba3-1f4d0cca9290-ovnkube-config\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087623 4689 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087632 4689 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087641 4689 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-slash\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087649 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2hpp\" (UniqueName: \"kubernetes.io/projected/ee6653df-cf05-46a7-9187-97bfc3c5b849-kube-api-access-w2hpp\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087658 4689 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087668 4689 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087677 4689 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087686 4689 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087694 4689 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee6653df-cf05-46a7-9187-97bfc3c5b849-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087702 4689 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087711 4689 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087720 4689 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087729 4689 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-node-log\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.087738 4689 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ee6653df-cf05-46a7-9187-97bfc3c5b849-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.100539 4689 scope.go:117] "RemoveContainer" containerID="5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.116211 4689 scope.go:117] "RemoveContainer" containerID="e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.130513 4689 scope.go:117] "RemoveContainer" containerID="c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.143645 4689 scope.go:117] "RemoveContainer" containerID="f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.160583 4689 scope.go:117] "RemoveContainer" containerID="a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.173048 4689 scope.go:117] "RemoveContainer" containerID="c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.186383 4689 scope.go:117] "RemoveContainer" containerID="10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189139 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-systemd-units\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189194 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-run-systemd\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189220 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92ac7a46-152d-4727-8ba3-1f4d0cca9290-ovn-node-metrics-cert\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189241 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-systemd-units\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189253 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-run-ovn\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189290 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-run-ovn\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189294 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-cni-bin\") pod \"ovnkube-node-6dqt8\" (UID: 
\"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189312 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-run-systemd\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189331 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-slash\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189362 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-slash\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189322 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-cni-bin\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189398 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-etc-openvswitch\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc 
kubenswrapper[4689]: I0307 04:30:38.189437 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd6sv\" (UniqueName: \"kubernetes.io/projected/92ac7a46-152d-4727-8ba3-1f4d0cca9290-kube-api-access-kd6sv\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189489 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-etc-openvswitch\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189505 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/92ac7a46-152d-4727-8ba3-1f4d0cca9290-ovnkube-script-lib\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189538 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189578 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-run-netns\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc 
kubenswrapper[4689]: I0307 04:30:38.189596 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-node-log\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189619 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-var-lib-openvswitch\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189643 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/92ac7a46-152d-4727-8ba3-1f4d0cca9290-ovnkube-config\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189726 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-kubelet\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189752 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-node-log\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189941 4689 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-run-ovn-kubernetes\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.189977 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-run-openvswitch\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190001 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-log-socket\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190022 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92ac7a46-152d-4727-8ba3-1f4d0cca9290-env-overrides\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190048 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-cni-netd\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190166 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-cni-netd\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190191 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190232 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-var-lib-openvswitch\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190233 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-run-ovn-kubernetes\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190202 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-kubelet\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190271 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-run-openvswitch\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190306 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-log-socket\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190746 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92ac7a46-152d-4727-8ba3-1f4d0cca9290-host-run-netns\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190793 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/92ac7a46-152d-4727-8ba3-1f4d0cca9290-ovnkube-script-lib\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.190939 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92ac7a46-152d-4727-8ba3-1f4d0cca9290-env-overrides\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.191128 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/92ac7a46-152d-4727-8ba3-1f4d0cca9290-ovnkube-config\") pod \"ovnkube-node-6dqt8\" (UID: 
\"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.193990 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92ac7a46-152d-4727-8ba3-1f4d0cca9290-ovn-node-metrics-cert\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.204636 4689 scope.go:117] "RemoveContainer" containerID="a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196" Mar 07 04:30:38 crc kubenswrapper[4689]: E0307 04:30:38.205198 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196\": container with ID starting with a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196 not found: ID does not exist" containerID="a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.205231 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196"} err="failed to get container status \"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196\": rpc error: code = NotFound desc = could not find container \"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196\": container with ID starting with a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.205253 4689 scope.go:117] "RemoveContainer" containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" Mar 07 04:30:38 crc kubenswrapper[4689]: E0307 04:30:38.205509 4689 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\": container with ID starting with 3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a not found: ID does not exist" containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.205545 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a"} err="failed to get container status \"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\": rpc error: code = NotFound desc = could not find container \"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\": container with ID starting with 3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.205568 4689 scope.go:117] "RemoveContainer" containerID="ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d" Mar 07 04:30:38 crc kubenswrapper[4689]: E0307 04:30:38.205885 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\": container with ID starting with ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d not found: ID does not exist" containerID="ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.205910 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d"} err="failed to get container status \"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\": rpc error: code = NotFound desc = could not find container 
\"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\": container with ID starting with ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.205925 4689 scope.go:117] "RemoveContainer" containerID="5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358" Mar 07 04:30:38 crc kubenswrapper[4689]: E0307 04:30:38.206278 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\": container with ID starting with 5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358 not found: ID does not exist" containerID="5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.206306 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358"} err="failed to get container status \"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\": rpc error: code = NotFound desc = could not find container \"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\": container with ID starting with 5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.206321 4689 scope.go:117] "RemoveContainer" containerID="e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606" Mar 07 04:30:38 crc kubenswrapper[4689]: E0307 04:30:38.206676 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\": container with ID starting with e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606 not found: ID does not exist" 
containerID="e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.206794 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606"} err="failed to get container status \"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\": rpc error: code = NotFound desc = could not find container \"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\": container with ID starting with e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.206879 4689 scope.go:117] "RemoveContainer" containerID="c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76" Mar 07 04:30:38 crc kubenswrapper[4689]: E0307 04:30:38.207233 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\": container with ID starting with c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76 not found: ID does not exist" containerID="c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.207258 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76"} err="failed to get container status \"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\": rpc error: code = NotFound desc = could not find container \"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\": container with ID starting with c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.207274 4689 scope.go:117] 
"RemoveContainer" containerID="f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f" Mar 07 04:30:38 crc kubenswrapper[4689]: E0307 04:30:38.207512 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\": container with ID starting with f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f not found: ID does not exist" containerID="f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.207551 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f"} err="failed to get container status \"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\": rpc error: code = NotFound desc = could not find container \"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\": container with ID starting with f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.207583 4689 scope.go:117] "RemoveContainer" containerID="a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411" Mar 07 04:30:38 crc kubenswrapper[4689]: E0307 04:30:38.207947 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\": container with ID starting with a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411 not found: ID does not exist" containerID="a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.208370 4689 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411"} err="failed to get container status \"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\": rpc error: code = NotFound desc = could not find container \"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\": container with ID starting with a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.208498 4689 scope.go:117] "RemoveContainer" containerID="c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da" Mar 07 04:30:38 crc kubenswrapper[4689]: E0307 04:30:38.210280 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\": container with ID starting with c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da not found: ID does not exist" containerID="c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.210306 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da"} err="failed to get container status \"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\": rpc error: code = NotFound desc = could not find container \"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\": container with ID starting with c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.210319 4689 scope.go:117] "RemoveContainer" containerID="10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816" Mar 07 04:30:38 crc kubenswrapper[4689]: E0307 04:30:38.210568 4689 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\": container with ID starting with 10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816 not found: ID does not exist" containerID="10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.210596 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816"} err="failed to get container status \"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\": rpc error: code = NotFound desc = could not find container \"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\": container with ID starting with 10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.210613 4689 scope.go:117] "RemoveContainer" containerID="a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.211257 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd6sv\" (UniqueName: \"kubernetes.io/projected/92ac7a46-152d-4727-8ba3-1f4d0cca9290-kube-api-access-kd6sv\") pod \"ovnkube-node-6dqt8\" (UID: \"92ac7a46-152d-4727-8ba3-1f4d0cca9290\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.211508 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196"} err="failed to get container status \"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196\": rpc error: code = NotFound desc = could not find container \"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196\": container with ID 
starting with a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.211591 4689 scope.go:117] "RemoveContainer" containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.211875 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a"} err="failed to get container status \"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\": rpc error: code = NotFound desc = could not find container \"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\": container with ID starting with 3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.211896 4689 scope.go:117] "RemoveContainer" containerID="ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.212092 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d"} err="failed to get container status \"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\": rpc error: code = NotFound desc = could not find container \"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\": container with ID starting with ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.212116 4689 scope.go:117] "RemoveContainer" containerID="5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.212327 4689 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358"} err="failed to get container status \"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\": rpc error: code = NotFound desc = could not find container \"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\": container with ID starting with 5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.212349 4689 scope.go:117] "RemoveContainer" containerID="e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.212542 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606"} err="failed to get container status \"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\": rpc error: code = NotFound desc = could not find container \"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\": container with ID starting with e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.212565 4689 scope.go:117] "RemoveContainer" containerID="c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.212804 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76"} err="failed to get container status \"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\": rpc error: code = NotFound desc = could not find container \"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\": container with ID starting with c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76 not found: ID does not 
exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.212848 4689 scope.go:117] "RemoveContainer" containerID="f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.213051 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f"} err="failed to get container status \"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\": rpc error: code = NotFound desc = could not find container \"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\": container with ID starting with f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.213070 4689 scope.go:117] "RemoveContainer" containerID="a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.213274 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411"} err="failed to get container status \"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\": rpc error: code = NotFound desc = could not find container \"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\": container with ID starting with a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.213295 4689 scope.go:117] "RemoveContainer" containerID="c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.213488 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da"} err="failed to get container status 
\"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\": rpc error: code = NotFound desc = could not find container \"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\": container with ID starting with c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.213505 4689 scope.go:117] "RemoveContainer" containerID="10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.213702 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816"} err="failed to get container status \"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\": rpc error: code = NotFound desc = could not find container \"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\": container with ID starting with 10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.213736 4689 scope.go:117] "RemoveContainer" containerID="a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.213974 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196"} err="failed to get container status \"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196\": rpc error: code = NotFound desc = could not find container \"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196\": container with ID starting with a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.214027 4689 scope.go:117] "RemoveContainer" 
containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.214325 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a"} err="failed to get container status \"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\": rpc error: code = NotFound desc = could not find container \"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\": container with ID starting with 3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.214351 4689 scope.go:117] "RemoveContainer" containerID="ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.214636 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d"} err="failed to get container status \"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\": rpc error: code = NotFound desc = could not find container \"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\": container with ID starting with ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.214690 4689 scope.go:117] "RemoveContainer" containerID="5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.214882 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358"} err="failed to get container status \"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\": rpc error: code = NotFound desc = could 
not find container \"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\": container with ID starting with 5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.214922 4689 scope.go:117] "RemoveContainer" containerID="e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.215166 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606"} err="failed to get container status \"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\": rpc error: code = NotFound desc = could not find container \"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\": container with ID starting with e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.215208 4689 scope.go:117] "RemoveContainer" containerID="c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.215375 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76"} err="failed to get container status \"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\": rpc error: code = NotFound desc = could not find container \"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\": container with ID starting with c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.215397 4689 scope.go:117] "RemoveContainer" containerID="f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 
04:30:38.215549 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f"} err="failed to get container status \"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\": rpc error: code = NotFound desc = could not find container \"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\": container with ID starting with f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.215566 4689 scope.go:117] "RemoveContainer" containerID="a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.215718 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411"} err="failed to get container status \"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\": rpc error: code = NotFound desc = could not find container \"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\": container with ID starting with a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.215736 4689 scope.go:117] "RemoveContainer" containerID="c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.215894 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da"} err="failed to get container status \"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\": rpc error: code = NotFound desc = could not find container \"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\": container with ID starting with 
c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.215918 4689 scope.go:117] "RemoveContainer" containerID="10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.216133 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816"} err="failed to get container status \"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\": rpc error: code = NotFound desc = could not find container \"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\": container with ID starting with 10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.216151 4689 scope.go:117] "RemoveContainer" containerID="a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.216655 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196"} err="failed to get container status \"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196\": rpc error: code = NotFound desc = could not find container \"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196\": container with ID starting with a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.216702 4689 scope.go:117] "RemoveContainer" containerID="3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.216958 4689 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a"} err="failed to get container status \"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\": rpc error: code = NotFound desc = could not find container \"3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a\": container with ID starting with 3983caa520e094edc2249182332d6210179a2a1d99c51e55424b336b15e8e55a not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.216980 4689 scope.go:117] "RemoveContainer" containerID="ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.217199 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d"} err="failed to get container status \"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\": rpc error: code = NotFound desc = could not find container \"ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d\": container with ID starting with ba1740dfdfedd6803c9ab03fbb6b5dd314378653e4f0f8b7ed8b379e2d74e11d not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.217220 4689 scope.go:117] "RemoveContainer" containerID="5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.217428 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358"} err="failed to get container status \"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\": rpc error: code = NotFound desc = could not find container \"5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358\": container with ID starting with 5e094be999c96edccd73957ef007db0104760a8b619cc12671eeea89334fb358 not found: ID does not 
exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.217449 4689 scope.go:117] "RemoveContainer" containerID="e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.217836 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606"} err="failed to get container status \"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\": rpc error: code = NotFound desc = could not find container \"e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606\": container with ID starting with e5a97d218800e4cece125cd977e4507320609062d905b751c656edadcd3d0606 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.217859 4689 scope.go:117] "RemoveContainer" containerID="c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.218090 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76"} err="failed to get container status \"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\": rpc error: code = NotFound desc = could not find container \"c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76\": container with ID starting with c64a9968181747cf269b50b164ce9b7e89eee376b5b5dff70a23d54f974ebe76 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.218117 4689 scope.go:117] "RemoveContainer" containerID="f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.218390 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f"} err="failed to get container status 
\"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\": rpc error: code = NotFound desc = could not find container \"f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f\": container with ID starting with f04bb30b2428d2af7102c116c25667b8ced802d68a46363e83b9f3419ffbdc8f not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.218420 4689 scope.go:117] "RemoveContainer" containerID="a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.218636 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411"} err="failed to get container status \"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\": rpc error: code = NotFound desc = could not find container \"a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411\": container with ID starting with a86810f6a8797829e0dca9f0462521e6f83f45e4aabf841a153e9a311081d411 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.218660 4689 scope.go:117] "RemoveContainer" containerID="c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.218871 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da"} err="failed to get container status \"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\": rpc error: code = NotFound desc = could not find container \"c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da\": container with ID starting with c9c764ec8b8efba589024417e4313b215c87ee3683b1cf6c5d6c7625389e83da not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.218893 4689 scope.go:117] "RemoveContainer" 
containerID="10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.219118 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816"} err="failed to get container status \"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\": rpc error: code = NotFound desc = could not find container \"10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816\": container with ID starting with 10b7b4a6d118d5addda481fd70b2f7a49be3068b4a3078c63437b021c4fda816 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.219142 4689 scope.go:117] "RemoveContainer" containerID="a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.219419 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196"} err="failed to get container status \"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196\": rpc error: code = NotFound desc = could not find container \"a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196\": container with ID starting with a021f120e24f3f4ebaa4257d48744f36ef143aadc3232f5805fb347657993196 not found: ID does not exist" Mar 07 04:30:38 crc kubenswrapper[4689]: I0307 04:30:38.235115 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:39 crc kubenswrapper[4689]: I0307 04:30:39.022472 4689 generic.go:334] "Generic (PLEG): container finished" podID="92ac7a46-152d-4727-8ba3-1f4d0cca9290" containerID="b0fc7827884da619fb7d8a61d141298287c1d711204e4030be6426deef39197c" exitCode=0 Mar 07 04:30:39 crc kubenswrapper[4689]: I0307 04:30:39.022572 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" event={"ID":"92ac7a46-152d-4727-8ba3-1f4d0cca9290","Type":"ContainerDied","Data":"b0fc7827884da619fb7d8a61d141298287c1d711204e4030be6426deef39197c"} Mar 07 04:30:39 crc kubenswrapper[4689]: I0307 04:30:39.022612 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" event={"ID":"92ac7a46-152d-4727-8ba3-1f4d0cca9290","Type":"ContainerStarted","Data":"7d214ef2361708a15f184dafe37026c5edf5cd30cdb00adb128927674d86b73c"} Mar 07 04:30:39 crc kubenswrapper[4689]: I0307 04:30:39.833528 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee6653df-cf05-46a7-9187-97bfc3c5b849" path="/var/lib/kubelet/pods/ee6653df-cf05-46a7-9187-97bfc3c5b849/volumes" Mar 07 04:30:40 crc kubenswrapper[4689]: I0307 04:30:40.036253 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" event={"ID":"92ac7a46-152d-4727-8ba3-1f4d0cca9290","Type":"ContainerStarted","Data":"8bfbce55d4700574e19a00da7336d1983f2384564c9dcbb6d7cc3d254160415e"} Mar 07 04:30:40 crc kubenswrapper[4689]: I0307 04:30:40.036296 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" event={"ID":"92ac7a46-152d-4727-8ba3-1f4d0cca9290","Type":"ContainerStarted","Data":"282b5f75ac0831b7a296c23500c72b7f4b2513d06a31d136b24d186762c6a7ed"} Mar 07 04:30:40 crc kubenswrapper[4689]: I0307 04:30:40.036311 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" event={"ID":"92ac7a46-152d-4727-8ba3-1f4d0cca9290","Type":"ContainerStarted","Data":"c8c9a2c873164ace60fff34868d83e5d47eecd5c12c823c74a1ef0b605f5d475"} Mar 07 04:30:40 crc kubenswrapper[4689]: I0307 04:30:40.036325 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" event={"ID":"92ac7a46-152d-4727-8ba3-1f4d0cca9290","Type":"ContainerStarted","Data":"2d193f2812cf10abd87fa51a3cefa60f1349d59c4f00b11faef7e23d34d7c530"} Mar 07 04:30:40 crc kubenswrapper[4689]: I0307 04:30:40.036337 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" event={"ID":"92ac7a46-152d-4727-8ba3-1f4d0cca9290","Type":"ContainerStarted","Data":"7a84405f05b97581fe8dc65fac8757208df6878e1a8d0e506876b8c51344df65"} Mar 07 04:30:40 crc kubenswrapper[4689]: I0307 04:30:40.036348 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" event={"ID":"92ac7a46-152d-4727-8ba3-1f4d0cca9290","Type":"ContainerStarted","Data":"2fb9eaf98107e12665cf6ee5be1547cb4a641ce4334eaf5df3d271cc6f05bbe6"} Mar 07 04:30:43 crc kubenswrapper[4689]: I0307 04:30:43.063761 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" event={"ID":"92ac7a46-152d-4727-8ba3-1f4d0cca9290","Type":"ContainerStarted","Data":"a1ccebd5ca521f58caef43c1767b4e43a2aa58e6a9d605b4c0018528f17ace6c"} Mar 07 04:30:45 crc kubenswrapper[4689]: I0307 04:30:45.079134 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" event={"ID":"92ac7a46-152d-4727-8ba3-1f4d0cca9290","Type":"ContainerStarted","Data":"75b08e2794024e95ce88010b61cb38bffb8b2273e836e7def398a83af732a5f4"} Mar 07 04:30:45 crc kubenswrapper[4689]: I0307 04:30:45.079695 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:45 
crc kubenswrapper[4689]: I0307 04:30:45.119383 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:45 crc kubenswrapper[4689]: I0307 04:30:45.126569 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" podStartSLOduration=8.12655031 podStartE2EDuration="8.12655031s" podCreationTimestamp="2026-03-07 04:30:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:30:45.120857005 +0000 UTC m=+690.167240514" watchObservedRunningTime="2026-03-07 04:30:45.12655031 +0000 UTC m=+690.172933799" Mar 07 04:30:46 crc kubenswrapper[4689]: I0307 04:30:46.085411 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:46 crc kubenswrapper[4689]: I0307 04:30:46.085492 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:46 crc kubenswrapper[4689]: I0307 04:30:46.119735 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:30:49 crc kubenswrapper[4689]: I0307 04:30:49.123958 4689 scope.go:117] "RemoveContainer" containerID="917935269d5fc94cd233e15f2825a1cb7041f968721bc786a2a97c66eb8a5338" Mar 07 04:30:49 crc kubenswrapper[4689]: I0307 04:30:49.172963 4689 scope.go:117] "RemoveContainer" containerID="f97a9da58e3b528f5f004a14e794d6ed3b0c80c34b9269631df208850715d7f2" Mar 07 04:30:49 crc kubenswrapper[4689]: I0307 04:30:49.219080 4689 scope.go:117] "RemoveContainer" containerID="4e146dc08141e9be9108cb1f340c11ee0180591f9fd4fe6c8c2e47acbb0602a5" Mar 07 04:30:49 crc kubenswrapper[4689]: I0307 04:30:49.826715 4689 scope.go:117] "RemoveContainer" 
containerID="893297981dd6ce3d3fbe960d1e5b7c6adc5bb2f18dcfd916b37cf25761cff3d9" Mar 07 04:30:49 crc kubenswrapper[4689]: E0307 04:30:49.827058 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wmhqx_openshift-multus(5508b217-e634-41a8-813a-65ae39d7ea3d)\"" pod="openshift-multus/multus-wmhqx" podUID="5508b217-e634-41a8-813a-65ae39d7ea3d" Mar 07 04:30:50 crc kubenswrapper[4689]: I0307 04:30:50.114539 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wmhqx_5508b217-e634-41a8-813a-65ae39d7ea3d/kube-multus/2.log" Mar 07 04:31:04 crc kubenswrapper[4689]: I0307 04:31:04.826484 4689 scope.go:117] "RemoveContainer" containerID="893297981dd6ce3d3fbe960d1e5b7c6adc5bb2f18dcfd916b37cf25761cff3d9" Mar 07 04:31:05 crc kubenswrapper[4689]: I0307 04:31:05.234045 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wmhqx_5508b217-e634-41a8-813a-65ae39d7ea3d/kube-multus/2.log" Mar 07 04:31:05 crc kubenswrapper[4689]: I0307 04:31:05.234421 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wmhqx" event={"ID":"5508b217-e634-41a8-813a-65ae39d7ea3d","Type":"ContainerStarted","Data":"5803989e3bd04392449ff522d6ed3cfba4f750e991cfe712b7a981d3aa08bf6a"} Mar 07 04:31:08 crc kubenswrapper[4689]: I0307 04:31:08.272006 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6dqt8" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.138193 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df"] Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.139901 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.143713 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.152156 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df"] Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.253910 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhclr\" (UniqueName: \"kubernetes.io/projected/a69229a6-7e04-4039-b08e-09cef56b36ba-kube-api-access-dhclr\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.254428 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.254722 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:12 crc kubenswrapper[4689]: 
I0307 04:31:12.356157 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhclr\" (UniqueName: \"kubernetes.io/projected/a69229a6-7e04-4039-b08e-09cef56b36ba-kube-api-access-dhclr\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.356280 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.356306 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.356839 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.357134 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.384929 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhclr\" (UniqueName: \"kubernetes.io/projected/a69229a6-7e04-4039-b08e-09cef56b36ba-kube-api-access-dhclr\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.459850 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:12 crc kubenswrapper[4689]: I0307 04:31:12.735633 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df"] Mar 07 04:31:13 crc kubenswrapper[4689]: I0307 04:31:13.284109 4689 generic.go:334] "Generic (PLEG): container finished" podID="a69229a6-7e04-4039-b08e-09cef56b36ba" containerID="74f978eab1023cde812b23268f6896f693a51043ededc6b8f7ba29ec733e6a9a" exitCode=0 Mar 07 04:31:13 crc kubenswrapper[4689]: I0307 04:31:13.284144 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" event={"ID":"a69229a6-7e04-4039-b08e-09cef56b36ba","Type":"ContainerDied","Data":"74f978eab1023cde812b23268f6896f693a51043ededc6b8f7ba29ec733e6a9a"} Mar 07 04:31:13 crc kubenswrapper[4689]: I0307 04:31:13.284191 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" event={"ID":"a69229a6-7e04-4039-b08e-09cef56b36ba","Type":"ContainerStarted","Data":"9bb67615d5994fec2a6d00264036ea30f6bb863c0448d531044909739f2af3b4"} Mar 07 04:31:16 crc kubenswrapper[4689]: I0307 04:31:16.314872 4689 generic.go:334] "Generic (PLEG): container finished" podID="a69229a6-7e04-4039-b08e-09cef56b36ba" containerID="c6006e26577ee0c44131fdcd67243721d2ef1efe9af4aac6a8dcee29c393d9b4" exitCode=0 Mar 07 04:31:16 crc kubenswrapper[4689]: I0307 04:31:16.314961 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" event={"ID":"a69229a6-7e04-4039-b08e-09cef56b36ba","Type":"ContainerDied","Data":"c6006e26577ee0c44131fdcd67243721d2ef1efe9af4aac6a8dcee29c393d9b4"} Mar 07 04:31:17 crc kubenswrapper[4689]: I0307 04:31:17.327355 4689 generic.go:334] "Generic (PLEG): container finished" podID="a69229a6-7e04-4039-b08e-09cef56b36ba" containerID="2c79d18735dc190641b526e0d2becca6a35ef889c9e1310b5d2cb3555c88bcb3" exitCode=0 Mar 07 04:31:17 crc kubenswrapper[4689]: I0307 04:31:17.327516 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" event={"ID":"a69229a6-7e04-4039-b08e-09cef56b36ba","Type":"ContainerDied","Data":"2c79d18735dc190641b526e0d2becca6a35ef889c9e1310b5d2cb3555c88bcb3"} Mar 07 04:31:18 crc kubenswrapper[4689]: I0307 04:31:18.700386 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:18 crc kubenswrapper[4689]: I0307 04:31:18.780653 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhclr\" (UniqueName: \"kubernetes.io/projected/a69229a6-7e04-4039-b08e-09cef56b36ba-kube-api-access-dhclr\") pod \"a69229a6-7e04-4039-b08e-09cef56b36ba\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " Mar 07 04:31:18 crc kubenswrapper[4689]: I0307 04:31:18.780871 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-util\") pod \"a69229a6-7e04-4039-b08e-09cef56b36ba\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " Mar 07 04:31:18 crc kubenswrapper[4689]: I0307 04:31:18.780916 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-bundle\") pod \"a69229a6-7e04-4039-b08e-09cef56b36ba\" (UID: \"a69229a6-7e04-4039-b08e-09cef56b36ba\") " Mar 07 04:31:18 crc kubenswrapper[4689]: I0307 04:31:18.783235 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-bundle" (OuterVolumeSpecName: "bundle") pod "a69229a6-7e04-4039-b08e-09cef56b36ba" (UID: "a69229a6-7e04-4039-b08e-09cef56b36ba"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:31:18 crc kubenswrapper[4689]: I0307 04:31:18.791570 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69229a6-7e04-4039-b08e-09cef56b36ba-kube-api-access-dhclr" (OuterVolumeSpecName: "kube-api-access-dhclr") pod "a69229a6-7e04-4039-b08e-09cef56b36ba" (UID: "a69229a6-7e04-4039-b08e-09cef56b36ba"). InnerVolumeSpecName "kube-api-access-dhclr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:31:18 crc kubenswrapper[4689]: I0307 04:31:18.883152 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhclr\" (UniqueName: \"kubernetes.io/projected/a69229a6-7e04-4039-b08e-09cef56b36ba-kube-api-access-dhclr\") on node \"crc\" DevicePath \"\"" Mar 07 04:31:18 crc kubenswrapper[4689]: I0307 04:31:18.883248 4689 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:31:19 crc kubenswrapper[4689]: I0307 04:31:19.004239 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-util" (OuterVolumeSpecName: "util") pod "a69229a6-7e04-4039-b08e-09cef56b36ba" (UID: "a69229a6-7e04-4039-b08e-09cef56b36ba"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:31:19 crc kubenswrapper[4689]: I0307 04:31:19.085492 4689 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a69229a6-7e04-4039-b08e-09cef56b36ba-util\") on node \"crc\" DevicePath \"\"" Mar 07 04:31:19 crc kubenswrapper[4689]: I0307 04:31:19.349157 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" event={"ID":"a69229a6-7e04-4039-b08e-09cef56b36ba","Type":"ContainerDied","Data":"9bb67615d5994fec2a6d00264036ea30f6bb863c0448d531044909739f2af3b4"} Mar 07 04:31:19 crc kubenswrapper[4689]: I0307 04:31:19.349245 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bb67615d5994fec2a6d00264036ea30f6bb863c0448d531044909739f2af3b4" Mar 07 04:31:19 crc kubenswrapper[4689]: I0307 04:31:19.349305 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.116048 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t"] Mar 07 04:31:29 crc kubenswrapper[4689]: E0307 04:31:29.116948 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69229a6-7e04-4039-b08e-09cef56b36ba" containerName="extract" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.116967 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69229a6-7e04-4039-b08e-09cef56b36ba" containerName="extract" Mar 07 04:31:29 crc kubenswrapper[4689]: E0307 04:31:29.116986 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69229a6-7e04-4039-b08e-09cef56b36ba" containerName="util" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.117014 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69229a6-7e04-4039-b08e-09cef56b36ba" containerName="util" Mar 07 04:31:29 crc kubenswrapper[4689]: E0307 04:31:29.117033 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69229a6-7e04-4039-b08e-09cef56b36ba" containerName="pull" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.117044 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69229a6-7e04-4039-b08e-09cef56b36ba" containerName="pull" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.117248 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69229a6-7e04-4039-b08e-09cef56b36ba" containerName="extract" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.117682 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.122464 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.122629 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.122649 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.122637 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.122842 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ffcg9" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.149486 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t"] Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.229224 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c42d4852-f686-4b2c-a03e-735b386d752a-webhook-cert\") pod \"metallb-operator-controller-manager-77cb8466b4-dgs2t\" (UID: \"c42d4852-f686-4b2c-a03e-735b386d752a\") " pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.229359 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhnfc\" (UniqueName: \"kubernetes.io/projected/c42d4852-f686-4b2c-a03e-735b386d752a-kube-api-access-dhnfc\") pod 
\"metallb-operator-controller-manager-77cb8466b4-dgs2t\" (UID: \"c42d4852-f686-4b2c-a03e-735b386d752a\") " pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.229510 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c42d4852-f686-4b2c-a03e-735b386d752a-apiservice-cert\") pod \"metallb-operator-controller-manager-77cb8466b4-dgs2t\" (UID: \"c42d4852-f686-4b2c-a03e-735b386d752a\") " pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.330613 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhnfc\" (UniqueName: \"kubernetes.io/projected/c42d4852-f686-4b2c-a03e-735b386d752a-kube-api-access-dhnfc\") pod \"metallb-operator-controller-manager-77cb8466b4-dgs2t\" (UID: \"c42d4852-f686-4b2c-a03e-735b386d752a\") " pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.330671 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c42d4852-f686-4b2c-a03e-735b386d752a-apiservice-cert\") pod \"metallb-operator-controller-manager-77cb8466b4-dgs2t\" (UID: \"c42d4852-f686-4b2c-a03e-735b386d752a\") " pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.330704 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c42d4852-f686-4b2c-a03e-735b386d752a-webhook-cert\") pod \"metallb-operator-controller-manager-77cb8466b4-dgs2t\" (UID: \"c42d4852-f686-4b2c-a03e-735b386d752a\") " pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:29 crc 
kubenswrapper[4689]: I0307 04:31:29.337020 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c42d4852-f686-4b2c-a03e-735b386d752a-webhook-cert\") pod \"metallb-operator-controller-manager-77cb8466b4-dgs2t\" (UID: \"c42d4852-f686-4b2c-a03e-735b386d752a\") " pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.337588 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c42d4852-f686-4b2c-a03e-735b386d752a-apiservice-cert\") pod \"metallb-operator-controller-manager-77cb8466b4-dgs2t\" (UID: \"c42d4852-f686-4b2c-a03e-735b386d752a\") " pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.350843 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhnfc\" (UniqueName: \"kubernetes.io/projected/c42d4852-f686-4b2c-a03e-735b386d752a-kube-api-access-dhnfc\") pod \"metallb-operator-controller-manager-77cb8466b4-dgs2t\" (UID: \"c42d4852-f686-4b2c-a03e-735b386d752a\") " pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.466792 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.602272 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt"] Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.602842 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.622491 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.622530 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.622636 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-w9qpt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.637109 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt"] Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.738844 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk9f6\" (UniqueName: \"kubernetes.io/projected/3838aa56-d0d3-4bce-95d0-7e760c2be14b-kube-api-access-tk9f6\") pod \"metallb-operator-webhook-server-65b497c9c9-r86tt\" (UID: \"3838aa56-d0d3-4bce-95d0-7e760c2be14b\") " pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.738905 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3838aa56-d0d3-4bce-95d0-7e760c2be14b-apiservice-cert\") pod \"metallb-operator-webhook-server-65b497c9c9-r86tt\" (UID: \"3838aa56-d0d3-4bce-95d0-7e760c2be14b\") " pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.738927 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/3838aa56-d0d3-4bce-95d0-7e760c2be14b-webhook-cert\") pod \"metallb-operator-webhook-server-65b497c9c9-r86tt\" (UID: \"3838aa56-d0d3-4bce-95d0-7e760c2be14b\") " pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.839808 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3838aa56-d0d3-4bce-95d0-7e760c2be14b-apiservice-cert\") pod \"metallb-operator-webhook-server-65b497c9c9-r86tt\" (UID: \"3838aa56-d0d3-4bce-95d0-7e760c2be14b\") " pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.839849 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3838aa56-d0d3-4bce-95d0-7e760c2be14b-webhook-cert\") pod \"metallb-operator-webhook-server-65b497c9c9-r86tt\" (UID: \"3838aa56-d0d3-4bce-95d0-7e760c2be14b\") " pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.839932 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk9f6\" (UniqueName: \"kubernetes.io/projected/3838aa56-d0d3-4bce-95d0-7e760c2be14b-kube-api-access-tk9f6\") pod \"metallb-operator-webhook-server-65b497c9c9-r86tt\" (UID: \"3838aa56-d0d3-4bce-95d0-7e760c2be14b\") " pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.844163 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3838aa56-d0d3-4bce-95d0-7e760c2be14b-apiservice-cert\") pod \"metallb-operator-webhook-server-65b497c9c9-r86tt\" (UID: \"3838aa56-d0d3-4bce-95d0-7e760c2be14b\") " pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 
04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.847001 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3838aa56-d0d3-4bce-95d0-7e760c2be14b-webhook-cert\") pod \"metallb-operator-webhook-server-65b497c9c9-r86tt\" (UID: \"3838aa56-d0d3-4bce-95d0-7e760c2be14b\") " pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.855935 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk9f6\" (UniqueName: \"kubernetes.io/projected/3838aa56-d0d3-4bce-95d0-7e760c2be14b-kube-api-access-tk9f6\") pod \"metallb-operator-webhook-server-65b497c9c9-r86tt\" (UID: \"3838aa56-d0d3-4bce-95d0-7e760c2be14b\") " pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.943109 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:31:29 crc kubenswrapper[4689]: I0307 04:31:29.979444 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t"] Mar 07 04:31:29 crc kubenswrapper[4689]: W0307 04:31:29.988148 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc42d4852_f686_4b2c_a03e_735b386d752a.slice/crio-b503bf178925d70d8d84a2a97b95e04a2bfce2edb90b2fd03f97c269fbc103dc WatchSource:0}: Error finding container b503bf178925d70d8d84a2a97b95e04a2bfce2edb90b2fd03f97c269fbc103dc: Status 404 returned error can't find the container with id b503bf178925d70d8d84a2a97b95e04a2bfce2edb90b2fd03f97c269fbc103dc Mar 07 04:31:30 crc kubenswrapper[4689]: I0307 04:31:30.167163 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt"] Mar 07 04:31:30 
crc kubenswrapper[4689]: W0307 04:31:30.178967 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3838aa56_d0d3_4bce_95d0_7e760c2be14b.slice/crio-edf9b98a54469461939a899af6385b97c2fdae44f9b28df9b3c1c4f44ccbbc09 WatchSource:0}: Error finding container edf9b98a54469461939a899af6385b97c2fdae44f9b28df9b3c1c4f44ccbbc09: Status 404 returned error can't find the container with id edf9b98a54469461939a899af6385b97c2fdae44f9b28df9b3c1c4f44ccbbc09 Mar 07 04:31:30 crc kubenswrapper[4689]: I0307 04:31:30.406889 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" event={"ID":"3838aa56-d0d3-4bce-95d0-7e760c2be14b","Type":"ContainerStarted","Data":"edf9b98a54469461939a899af6385b97c2fdae44f9b28df9b3c1c4f44ccbbc09"} Mar 07 04:31:30 crc kubenswrapper[4689]: I0307 04:31:30.408112 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" event={"ID":"c42d4852-f686-4b2c-a03e-735b386d752a","Type":"ContainerStarted","Data":"b503bf178925d70d8d84a2a97b95e04a2bfce2edb90b2fd03f97c269fbc103dc"} Mar 07 04:31:37 crc kubenswrapper[4689]: I0307 04:31:37.460856 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" event={"ID":"3838aa56-d0d3-4bce-95d0-7e760c2be14b","Type":"ContainerStarted","Data":"22f1bf879e7234786315378b5d91c4369309641fadb7b1a053a51db09b264122"} Mar 07 04:31:37 crc kubenswrapper[4689]: I0307 04:31:37.461742 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:31:37 crc kubenswrapper[4689]: I0307 04:31:37.464082 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" 
event={"ID":"c42d4852-f686-4b2c-a03e-735b386d752a","Type":"ContainerStarted","Data":"a11aa9298cee702b44734c89c64f024a33c54ab6b82246a8a7fa4d5b92e8af8c"} Mar 07 04:31:37 crc kubenswrapper[4689]: I0307 04:31:37.464335 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:31:37 crc kubenswrapper[4689]: I0307 04:31:37.515846 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" podStartSLOduration=2.121089659 podStartE2EDuration="8.515823091s" podCreationTimestamp="2026-03-07 04:31:29 +0000 UTC" firstStartedPulling="2026-03-07 04:31:29.990545431 +0000 UTC m=+735.036928920" lastFinishedPulling="2026-03-07 04:31:36.385278863 +0000 UTC m=+741.431662352" observedRunningTime="2026-03-07 04:31:37.513155438 +0000 UTC m=+742.559538967" watchObservedRunningTime="2026-03-07 04:31:37.515823091 +0000 UTC m=+742.562206590" Mar 07 04:31:37 crc kubenswrapper[4689]: I0307 04:31:37.519973 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" podStartSLOduration=2.316649771 podStartE2EDuration="8.519960113s" podCreationTimestamp="2026-03-07 04:31:29 +0000 UTC" firstStartedPulling="2026-03-07 04:31:30.181706915 +0000 UTC m=+735.228090404" lastFinishedPulling="2026-03-07 04:31:36.385017257 +0000 UTC m=+741.431400746" observedRunningTime="2026-03-07 04:31:37.48776915 +0000 UTC m=+742.534152679" watchObservedRunningTime="2026-03-07 04:31:37.519960113 +0000 UTC m=+742.566343612" Mar 07 04:31:49 crc kubenswrapper[4689]: I0307 04:31:49.949013 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-65b497c9c9-r86tt" Mar 07 04:32:00 crc kubenswrapper[4689]: I0307 04:32:00.150087 4689 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29547632-8mxh5"] Mar 07 04:32:00 crc kubenswrapper[4689]: I0307 04:32:00.152155 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547632-8mxh5" Mar 07 04:32:00 crc kubenswrapper[4689]: I0307 04:32:00.155150 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:32:00 crc kubenswrapper[4689]: I0307 04:32:00.155763 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:32:00 crc kubenswrapper[4689]: I0307 04:32:00.156140 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:32:00 crc kubenswrapper[4689]: I0307 04:32:00.163064 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547632-8mxh5"] Mar 07 04:32:00 crc kubenswrapper[4689]: I0307 04:32:00.235027 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkng2\" (UniqueName: \"kubernetes.io/projected/d3fed712-f790-4590-850a-eea2fe0e36a9-kube-api-access-bkng2\") pod \"auto-csr-approver-29547632-8mxh5\" (UID: \"d3fed712-f790-4590-850a-eea2fe0e36a9\") " pod="openshift-infra/auto-csr-approver-29547632-8mxh5" Mar 07 04:32:00 crc kubenswrapper[4689]: I0307 04:32:00.336342 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkng2\" (UniqueName: \"kubernetes.io/projected/d3fed712-f790-4590-850a-eea2fe0e36a9-kube-api-access-bkng2\") pod \"auto-csr-approver-29547632-8mxh5\" (UID: \"d3fed712-f790-4590-850a-eea2fe0e36a9\") " pod="openshift-infra/auto-csr-approver-29547632-8mxh5" Mar 07 04:32:00 crc kubenswrapper[4689]: I0307 04:32:00.372424 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkng2\" (UniqueName: 
\"kubernetes.io/projected/d3fed712-f790-4590-850a-eea2fe0e36a9-kube-api-access-bkng2\") pod \"auto-csr-approver-29547632-8mxh5\" (UID: \"d3fed712-f790-4590-850a-eea2fe0e36a9\") " pod="openshift-infra/auto-csr-approver-29547632-8mxh5" Mar 07 04:32:00 crc kubenswrapper[4689]: I0307 04:32:00.471610 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547632-8mxh5" Mar 07 04:32:00 crc kubenswrapper[4689]: I0307 04:32:00.765766 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547632-8mxh5"] Mar 07 04:32:01 crc kubenswrapper[4689]: I0307 04:32:01.623441 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547632-8mxh5" event={"ID":"d3fed712-f790-4590-850a-eea2fe0e36a9","Type":"ContainerStarted","Data":"769c8126d1718c2df97094d2b4373812ceb30649f135bb9c2f424b11e4472784"} Mar 07 04:32:02 crc kubenswrapper[4689]: I0307 04:32:02.633485 4689 generic.go:334] "Generic (PLEG): container finished" podID="d3fed712-f790-4590-850a-eea2fe0e36a9" containerID="31d7a0089316d9e5ab3ae8149601cd5a0cb7f9defe50dda457b7078e8f866103" exitCode=0 Mar 07 04:32:02 crc kubenswrapper[4689]: I0307 04:32:02.633579 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547632-8mxh5" event={"ID":"d3fed712-f790-4590-850a-eea2fe0e36a9","Type":"ContainerDied","Data":"31d7a0089316d9e5ab3ae8149601cd5a0cb7f9defe50dda457b7078e8f866103"} Mar 07 04:32:03 crc kubenswrapper[4689]: I0307 04:32:03.922115 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547632-8mxh5" Mar 07 04:32:03 crc kubenswrapper[4689]: I0307 04:32:03.997713 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkng2\" (UniqueName: \"kubernetes.io/projected/d3fed712-f790-4590-850a-eea2fe0e36a9-kube-api-access-bkng2\") pod \"d3fed712-f790-4590-850a-eea2fe0e36a9\" (UID: \"d3fed712-f790-4590-850a-eea2fe0e36a9\") " Mar 07 04:32:04 crc kubenswrapper[4689]: I0307 04:32:04.026103 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3fed712-f790-4590-850a-eea2fe0e36a9-kube-api-access-bkng2" (OuterVolumeSpecName: "kube-api-access-bkng2") pod "d3fed712-f790-4590-850a-eea2fe0e36a9" (UID: "d3fed712-f790-4590-850a-eea2fe0e36a9"). InnerVolumeSpecName "kube-api-access-bkng2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:32:04 crc kubenswrapper[4689]: I0307 04:32:04.099970 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkng2\" (UniqueName: \"kubernetes.io/projected/d3fed712-f790-4590-850a-eea2fe0e36a9-kube-api-access-bkng2\") on node \"crc\" DevicePath \"\"" Mar 07 04:32:04 crc kubenswrapper[4689]: I0307 04:32:04.650090 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547632-8mxh5" event={"ID":"d3fed712-f790-4590-850a-eea2fe0e36a9","Type":"ContainerDied","Data":"769c8126d1718c2df97094d2b4373812ceb30649f135bb9c2f424b11e4472784"} Mar 07 04:32:04 crc kubenswrapper[4689]: I0307 04:32:04.650484 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="769c8126d1718c2df97094d2b4373812ceb30649f135bb9c2f424b11e4472784" Mar 07 04:32:04 crc kubenswrapper[4689]: I0307 04:32:04.650221 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547632-8mxh5" Mar 07 04:32:05 crc kubenswrapper[4689]: I0307 04:32:05.004588 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547626-qtklx"] Mar 07 04:32:05 crc kubenswrapper[4689]: I0307 04:32:05.012014 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547626-qtklx"] Mar 07 04:32:05 crc kubenswrapper[4689]: I0307 04:32:05.835452 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c64cdb5e-38bb-478e-8fcc-54fb0a234918" path="/var/lib/kubelet/pods/c64cdb5e-38bb-478e-8fcc-54fb0a234918/volumes" Mar 07 04:32:09 crc kubenswrapper[4689]: I0307 04:32:09.471683 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-77cb8466b4-dgs2t" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.198260 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-rgs5v"] Mar 07 04:32:10 crc kubenswrapper[4689]: E0307 04:32:10.199546 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fed712-f790-4590-850a-eea2fe0e36a9" containerName="oc" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.199591 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fed712-f790-4590-850a-eea2fe0e36a9" containerName="oc" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.200336 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fed712-f790-4590-850a-eea2fe0e36a9" containerName="oc" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.214025 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.217864 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.218332 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-792vp" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.218565 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.224449 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7"] Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.225595 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.227520 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.246211 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7"] Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.284795 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2bjl\" (UniqueName: \"kubernetes.io/projected/9f428eff-914b-4bee-a9ee-7399d39a38c0-kube-api-access-q2bjl\") pod \"frr-k8s-webhook-server-7f989f654f-d5xx7\" (UID: \"9f428eff-914b-4bee-a9ee-7399d39a38c0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.284850 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/904924fa-b259-4cf4-8296-a7534f087102-frr-startup\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.284871 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-reloader\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.284896 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f428eff-914b-4bee-a9ee-7399d39a38c0-cert\") pod \"frr-k8s-webhook-server-7f989f654f-d5xx7\" (UID: \"9f428eff-914b-4bee-a9ee-7399d39a38c0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.284912 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49gwg\" (UniqueName: \"kubernetes.io/projected/904924fa-b259-4cf4-8296-a7534f087102-kube-api-access-49gwg\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.284936 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-frr-conf\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.284950 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-metrics\") 
pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.284964 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/904924fa-b259-4cf4-8296-a7534f087102-metrics-certs\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.284985 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-frr-sockets\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.298402 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-mpvnx"] Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.299408 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.301779 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.301961 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.302012 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.302108 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5v7kw" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.308442 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-z2zk8"] Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.310975 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.315339 4689 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.318423 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-z2zk8"] Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386055 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-frr-sockets\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386111 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-memberlist\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386131 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6gxg\" (UniqueName: \"kubernetes.io/projected/c6fd9827-217f-4143-94c3-13c5c8257e98-kube-api-access-b6gxg\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386182 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bjl\" (UniqueName: \"kubernetes.io/projected/9f428eff-914b-4bee-a9ee-7399d39a38c0-kube-api-access-q2bjl\") pod \"frr-k8s-webhook-server-7f989f654f-d5xx7\" (UID: \"9f428eff-914b-4bee-a9ee-7399d39a38c0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" Mar 07 04:32:10 crc 
kubenswrapper[4689]: I0307 04:32:10.386233 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c6fd9827-217f-4143-94c3-13c5c8257e98-metallb-excludel2\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386318 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/904924fa-b259-4cf4-8296-a7534f087102-frr-startup\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386348 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-metrics-certs\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386368 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-reloader\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386409 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f428eff-914b-4bee-a9ee-7399d39a38c0-cert\") pod \"frr-k8s-webhook-server-7f989f654f-d5xx7\" (UID: \"9f428eff-914b-4bee-a9ee-7399d39a38c0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386440 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-49gwg\" (UniqueName: \"kubernetes.io/projected/904924fa-b259-4cf4-8296-a7534f087102-kube-api-access-49gwg\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386473 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwjhn\" (UniqueName: \"kubernetes.io/projected/92ada8da-b00e-4106-8831-bfe7a78d4806-kube-api-access-jwjhn\") pod \"controller-86ddb6bd46-z2zk8\" (UID: \"92ada8da-b00e-4106-8831-bfe7a78d4806\") " pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386502 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92ada8da-b00e-4106-8831-bfe7a78d4806-metrics-certs\") pod \"controller-86ddb6bd46-z2zk8\" (UID: \"92ada8da-b00e-4106-8831-bfe7a78d4806\") " pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386549 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-frr-conf\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386547 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-frr-sockets\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386583 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-metrics\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386609 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92ada8da-b00e-4106-8831-bfe7a78d4806-cert\") pod \"controller-86ddb6bd46-z2zk8\" (UID: \"92ada8da-b00e-4106-8831-bfe7a78d4806\") " pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.386640 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/904924fa-b259-4cf4-8296-a7534f087102-metrics-certs\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: E0307 04:32:10.386953 4689 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 07 04:32:10 crc kubenswrapper[4689]: E0307 04:32:10.387050 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f428eff-914b-4bee-a9ee-7399d39a38c0-cert podName:9f428eff-914b-4bee-a9ee-7399d39a38c0 nodeName:}" failed. No retries permitted until 2026-03-07 04:32:10.887028493 +0000 UTC m=+775.933412072 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f428eff-914b-4bee-a9ee-7399d39a38c0-cert") pod "frr-k8s-webhook-server-7f989f654f-d5xx7" (UID: "9f428eff-914b-4bee-a9ee-7399d39a38c0") : secret "frr-k8s-webhook-server-cert" not found Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.387119 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/904924fa-b259-4cf4-8296-a7534f087102-frr-startup\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.387696 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-metrics\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.388636 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-frr-conf\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.388918 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/904924fa-b259-4cf4-8296-a7534f087102-reloader\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.392108 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/904924fa-b259-4cf4-8296-a7534f087102-metrics-certs\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " 
pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.408780 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49gwg\" (UniqueName: \"kubernetes.io/projected/904924fa-b259-4cf4-8296-a7534f087102-kube-api-access-49gwg\") pod \"frr-k8s-rgs5v\" (UID: \"904924fa-b259-4cf4-8296-a7534f087102\") " pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.414274 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2bjl\" (UniqueName: \"kubernetes.io/projected/9f428eff-914b-4bee-a9ee-7399d39a38c0-kube-api-access-q2bjl\") pod \"frr-k8s-webhook-server-7f989f654f-d5xx7\" (UID: \"9f428eff-914b-4bee-a9ee-7399d39a38c0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.488101 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c6fd9827-217f-4143-94c3-13c5c8257e98-metallb-excludel2\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.488163 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-metrics-certs\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.488243 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92ada8da-b00e-4106-8831-bfe7a78d4806-metrics-certs\") pod \"controller-86ddb6bd46-z2zk8\" (UID: \"92ada8da-b00e-4106-8831-bfe7a78d4806\") " pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 
04:32:10.488264 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwjhn\" (UniqueName: \"kubernetes.io/projected/92ada8da-b00e-4106-8831-bfe7a78d4806-kube-api-access-jwjhn\") pod \"controller-86ddb6bd46-z2zk8\" (UID: \"92ada8da-b00e-4106-8831-bfe7a78d4806\") " pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.488889 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92ada8da-b00e-4106-8831-bfe7a78d4806-cert\") pod \"controller-86ddb6bd46-z2zk8\" (UID: \"92ada8da-b00e-4106-8831-bfe7a78d4806\") " pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.488947 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-memberlist\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.488955 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c6fd9827-217f-4143-94c3-13c5c8257e98-metallb-excludel2\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.488970 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6gxg\" (UniqueName: \"kubernetes.io/projected/c6fd9827-217f-4143-94c3-13c5c8257e98-kube-api-access-b6gxg\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: E0307 04:32:10.489104 4689 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found 
Mar 07 04:32:10 crc kubenswrapper[4689]: E0307 04:32:10.489215 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-memberlist podName:c6fd9827-217f-4143-94c3-13c5c8257e98 nodeName:}" failed. No retries permitted until 2026-03-07 04:32:10.989163862 +0000 UTC m=+776.035547381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-memberlist") pod "speaker-mpvnx" (UID: "c6fd9827-217f-4143-94c3-13c5c8257e98") : secret "metallb-memberlist" not found Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.494715 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92ada8da-b00e-4106-8831-bfe7a78d4806-metrics-certs\") pod \"controller-86ddb6bd46-z2zk8\" (UID: \"92ada8da-b00e-4106-8831-bfe7a78d4806\") " pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.494780 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92ada8da-b00e-4106-8831-bfe7a78d4806-cert\") pod \"controller-86ddb6bd46-z2zk8\" (UID: \"92ada8da-b00e-4106-8831-bfe7a78d4806\") " pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.511823 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-metrics-certs\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.512126 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwjhn\" (UniqueName: \"kubernetes.io/projected/92ada8da-b00e-4106-8831-bfe7a78d4806-kube-api-access-jwjhn\") pod 
\"controller-86ddb6bd46-z2zk8\" (UID: \"92ada8da-b00e-4106-8831-bfe7a78d4806\") " pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.515472 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6gxg\" (UniqueName: \"kubernetes.io/projected/c6fd9827-217f-4143-94c3-13c5c8257e98-kube-api-access-b6gxg\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.549543 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.627770 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.685196 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rgs5v" event={"ID":"904924fa-b259-4cf4-8296-a7534f087102","Type":"ContainerStarted","Data":"d665fa6a05669c856ef3699542f4e7a38212a52c42311b1efd31bebf3f84123b"} Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.893521 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f428eff-914b-4bee-a9ee-7399d39a38c0-cert\") pod \"frr-k8s-webhook-server-7f989f654f-d5xx7\" (UID: \"9f428eff-914b-4bee-a9ee-7399d39a38c0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.898390 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-z2zk8"] Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.900224 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f428eff-914b-4bee-a9ee-7399d39a38c0-cert\") pod 
\"frr-k8s-webhook-server-7f989f654f-d5xx7\" (UID: \"9f428eff-914b-4bee-a9ee-7399d39a38c0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" Mar 07 04:32:10 crc kubenswrapper[4689]: W0307 04:32:10.907845 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92ada8da_b00e_4106_8831_bfe7a78d4806.slice/crio-89e1487d58faad0f68a825694e89b2c6823823629a2f6ee7084be4864650f4b1 WatchSource:0}: Error finding container 89e1487d58faad0f68a825694e89b2c6823823629a2f6ee7084be4864650f4b1: Status 404 returned error can't find the container with id 89e1487d58faad0f68a825694e89b2c6823823629a2f6ee7084be4864650f4b1 Mar 07 04:32:10 crc kubenswrapper[4689]: I0307 04:32:10.995582 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-memberlist\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:10 crc kubenswrapper[4689]: E0307 04:32:10.995782 4689 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 07 04:32:10 crc kubenswrapper[4689]: E0307 04:32:10.996291 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-memberlist podName:c6fd9827-217f-4143-94c3-13c5c8257e98 nodeName:}" failed. No retries permitted until 2026-03-07 04:32:11.996259492 +0000 UTC m=+777.042643011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-memberlist") pod "speaker-mpvnx" (UID: "c6fd9827-217f-4143-94c3-13c5c8257e98") : secret "metallb-memberlist" not found Mar 07 04:32:11 crc kubenswrapper[4689]: I0307 04:32:11.158893 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" Mar 07 04:32:11 crc kubenswrapper[4689]: I0307 04:32:11.353456 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7"] Mar 07 04:32:11 crc kubenswrapper[4689]: W0307 04:32:11.370013 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f428eff_914b_4bee_a9ee_7399d39a38c0.slice/crio-127392fd6e58d38c8ad00935bc777773d5eb3e2bfec278443a3f9ca415b6a01d WatchSource:0}: Error finding container 127392fd6e58d38c8ad00935bc777773d5eb3e2bfec278443a3f9ca415b6a01d: Status 404 returned error can't find the container with id 127392fd6e58d38c8ad00935bc777773d5eb3e2bfec278443a3f9ca415b6a01d Mar 07 04:32:11 crc kubenswrapper[4689]: I0307 04:32:11.695235 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-z2zk8" event={"ID":"92ada8da-b00e-4106-8831-bfe7a78d4806","Type":"ContainerStarted","Data":"158323296acfd71d2c0391b6e24c2109e0bbd741c27469d9b1e5f08a48769baa"} Mar 07 04:32:11 crc kubenswrapper[4689]: I0307 04:32:11.695561 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-z2zk8" event={"ID":"92ada8da-b00e-4106-8831-bfe7a78d4806","Type":"ContainerStarted","Data":"89e1487d58faad0f68a825694e89b2c6823823629a2f6ee7084be4864650f4b1"} Mar 07 04:32:11 crc kubenswrapper[4689]: I0307 04:32:11.696621 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" event={"ID":"9f428eff-914b-4bee-a9ee-7399d39a38c0","Type":"ContainerStarted","Data":"127392fd6e58d38c8ad00935bc777773d5eb3e2bfec278443a3f9ca415b6a01d"} Mar 07 04:32:12 crc kubenswrapper[4689]: I0307 04:32:12.009856 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-memberlist\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:12 crc kubenswrapper[4689]: I0307 04:32:12.017989 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6fd9827-217f-4143-94c3-13c5c8257e98-memberlist\") pod \"speaker-mpvnx\" (UID: \"c6fd9827-217f-4143-94c3-13c5c8257e98\") " pod="metallb-system/speaker-mpvnx" Mar 07 04:32:12 crc kubenswrapper[4689]: I0307 04:32:12.119904 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-mpvnx" Mar 07 04:32:12 crc kubenswrapper[4689]: W0307 04:32:12.151392 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6fd9827_217f_4143_94c3_13c5c8257e98.slice/crio-4ba3188b86a984f92856713915405575ed6475e7fc7ffd3a0e83a412b318bc08 WatchSource:0}: Error finding container 4ba3188b86a984f92856713915405575ed6475e7fc7ffd3a0e83a412b318bc08: Status 404 returned error can't find the container with id 4ba3188b86a984f92856713915405575ed6475e7fc7ffd3a0e83a412b318bc08 Mar 07 04:32:12 crc kubenswrapper[4689]: I0307 04:32:12.705217 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mpvnx" event={"ID":"c6fd9827-217f-4143-94c3-13c5c8257e98","Type":"ContainerStarted","Data":"f28db38ffc74cc4ec1b68468b7eb3fb0f289fc9953c378fb367d2445e01c913e"} Mar 07 04:32:12 crc kubenswrapper[4689]: I0307 04:32:12.705459 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mpvnx" event={"ID":"c6fd9827-217f-4143-94c3-13c5c8257e98","Type":"ContainerStarted","Data":"4ba3188b86a984f92856713915405575ed6475e7fc7ffd3a0e83a412b318bc08"} Mar 07 04:32:14 crc kubenswrapper[4689]: I0307 04:32:14.730987 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-86ddb6bd46-z2zk8" event={"ID":"92ada8da-b00e-4106-8831-bfe7a78d4806","Type":"ContainerStarted","Data":"15b1a8c2c62b1276bf333aa7426e78accd8ce9c5b6ba3b6773d70094476f9b73"} Mar 07 04:32:14 crc kubenswrapper[4689]: I0307 04:32:14.731303 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:14 crc kubenswrapper[4689]: I0307 04:32:14.733264 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mpvnx" event={"ID":"c6fd9827-217f-4143-94c3-13c5c8257e98","Type":"ContainerStarted","Data":"307f5304a810e559139bf6702be1551a0f1ec61c3c866b7ff9ff3e1d3549c3b0"} Mar 07 04:32:14 crc kubenswrapper[4689]: I0307 04:32:14.733822 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-mpvnx" Mar 07 04:32:14 crc kubenswrapper[4689]: I0307 04:32:14.779467 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-z2zk8" podStartSLOduration=1.675027807 podStartE2EDuration="4.779448339s" podCreationTimestamp="2026-03-07 04:32:10 +0000 UTC" firstStartedPulling="2026-03-07 04:32:11.031844407 +0000 UTC m=+776.078227906" lastFinishedPulling="2026-03-07 04:32:14.136264949 +0000 UTC m=+779.182648438" observedRunningTime="2026-03-07 04:32:14.759133508 +0000 UTC m=+779.805516997" watchObservedRunningTime="2026-03-07 04:32:14.779448339 +0000 UTC m=+779.825831828" Mar 07 04:32:14 crc kubenswrapper[4689]: I0307 04:32:14.781897 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-mpvnx" podStartSLOduration=3.087525121 podStartE2EDuration="4.781893156s" podCreationTimestamp="2026-03-07 04:32:10 +0000 UTC" firstStartedPulling="2026-03-07 04:32:12.445557952 +0000 UTC m=+777.491941481" lastFinishedPulling="2026-03-07 04:32:14.139926027 +0000 UTC m=+779.186309516" observedRunningTime="2026-03-07 04:32:14.778904055 +0000 UTC 
m=+779.825287544" watchObservedRunningTime="2026-03-07 04:32:14.781893156 +0000 UTC m=+779.828276645" Mar 07 04:32:19 crc kubenswrapper[4689]: I0307 04:32:19.774640 4689 generic.go:334] "Generic (PLEG): container finished" podID="904924fa-b259-4cf4-8296-a7534f087102" containerID="f4d9f7a0534f422f3b7de3a8d3d0732da97a107042a1a611d79ea8f36cce00c2" exitCode=0 Mar 07 04:32:19 crc kubenswrapper[4689]: I0307 04:32:19.774710 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rgs5v" event={"ID":"904924fa-b259-4cf4-8296-a7534f087102","Type":"ContainerDied","Data":"f4d9f7a0534f422f3b7de3a8d3d0732da97a107042a1a611d79ea8f36cce00c2"} Mar 07 04:32:19 crc kubenswrapper[4689]: I0307 04:32:19.784497 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" event={"ID":"9f428eff-914b-4bee-a9ee-7399d39a38c0","Type":"ContainerStarted","Data":"5cadddcd122f0b241e34485e253d5a4cdfc93b701e77fa79160eefcbbe8144f3"} Mar 07 04:32:19 crc kubenswrapper[4689]: I0307 04:32:19.784841 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" Mar 07 04:32:19 crc kubenswrapper[4689]: I0307 04:32:19.829633 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" podStartSLOduration=1.7868037289999998 podStartE2EDuration="9.829610342s" podCreationTimestamp="2026-03-07 04:32:10 +0000 UTC" firstStartedPulling="2026-03-07 04:32:11.371858487 +0000 UTC m=+776.418241986" lastFinishedPulling="2026-03-07 04:32:19.41466508 +0000 UTC m=+784.461048599" observedRunningTime="2026-03-07 04:32:19.822449838 +0000 UTC m=+784.868833347" watchObservedRunningTime="2026-03-07 04:32:19.829610342 +0000 UTC m=+784.875993851" Mar 07 04:32:20 crc kubenswrapper[4689]: I0307 04:32:20.791775 4689 generic.go:334] "Generic (PLEG): container finished" podID="904924fa-b259-4cf4-8296-a7534f087102" 
containerID="fc59a6635c770d3086f383aae09b1a52011db992e85f87136768dd91b0e38c5f" exitCode=0 Mar 07 04:32:20 crc kubenswrapper[4689]: I0307 04:32:20.792841 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rgs5v" event={"ID":"904924fa-b259-4cf4-8296-a7534f087102","Type":"ContainerDied","Data":"fc59a6635c770d3086f383aae09b1a52011db992e85f87136768dd91b0e38c5f"} Mar 07 04:32:21 crc kubenswrapper[4689]: I0307 04:32:21.801612 4689 generic.go:334] "Generic (PLEG): container finished" podID="904924fa-b259-4cf4-8296-a7534f087102" containerID="ed193a05d37e8e45a625d3e324bc30a8e929e764a7f81173cc29d2459454e1e6" exitCode=0 Mar 07 04:32:21 crc kubenswrapper[4689]: I0307 04:32:21.801697 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rgs5v" event={"ID":"904924fa-b259-4cf4-8296-a7534f087102","Type":"ContainerDied","Data":"ed193a05d37e8e45a625d3e324bc30a8e929e764a7f81173cc29d2459454e1e6"} Mar 07 04:32:22 crc kubenswrapper[4689]: I0307 04:32:22.122442 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-mpvnx" Mar 07 04:32:22 crc kubenswrapper[4689]: I0307 04:32:22.811786 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rgs5v" event={"ID":"904924fa-b259-4cf4-8296-a7534f087102","Type":"ContainerStarted","Data":"61c7e214911b6794560a121fed13bd85a78e5abb44bcd5e5d3edfaf0ea5aee2a"} Mar 07 04:32:22 crc kubenswrapper[4689]: I0307 04:32:22.811851 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rgs5v" event={"ID":"904924fa-b259-4cf4-8296-a7534f087102","Type":"ContainerStarted","Data":"cbcb4c86ccbf14c9fe0279fd62c3e00949a9409a82a9fc652a52f000bbed42a2"} Mar 07 04:32:22 crc kubenswrapper[4689]: I0307 04:32:22.811865 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rgs5v" 
event={"ID":"904924fa-b259-4cf4-8296-a7534f087102","Type":"ContainerStarted","Data":"88307c0692bfc63a9ebc46e50519582bc74931ac2335f564e4d0c88f6753f078"} Mar 07 04:32:23 crc kubenswrapper[4689]: I0307 04:32:23.839703 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:23 crc kubenswrapper[4689]: I0307 04:32:23.840043 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rgs5v" event={"ID":"904924fa-b259-4cf4-8296-a7534f087102","Type":"ContainerStarted","Data":"15c30d1c07ecbdbb3247b65c5c7011cd6f57cd3f422d8b569af855e2c8cd818c"} Mar 07 04:32:23 crc kubenswrapper[4689]: I0307 04:32:23.840078 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rgs5v" event={"ID":"904924fa-b259-4cf4-8296-a7534f087102","Type":"ContainerStarted","Data":"a1ca9772ba574c78fc9cd32ed6df984eb63cb37476863d9ceae35635824949f4"} Mar 07 04:32:23 crc kubenswrapper[4689]: I0307 04:32:23.840109 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rgs5v" event={"ID":"904924fa-b259-4cf4-8296-a7534f087102","Type":"ContainerStarted","Data":"0b7dc40a5ff706e5cf389c1c4ec6658ae35761ebd8e8aa5c1018c4b894f7f893"} Mar 07 04:32:23 crc kubenswrapper[4689]: I0307 04:32:23.867156 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-rgs5v" podStartSLOduration=5.054412386 podStartE2EDuration="13.867123505s" podCreationTimestamp="2026-03-07 04:32:10 +0000 UTC" firstStartedPulling="2026-03-07 04:32:10.653752565 +0000 UTC m=+775.700136064" lastFinishedPulling="2026-03-07 04:32:19.466463654 +0000 UTC m=+784.512847183" observedRunningTime="2026-03-07 04:32:23.860769783 +0000 UTC m=+788.907153322" watchObservedRunningTime="2026-03-07 04:32:23.867123505 +0000 UTC m=+788.913507034" Mar 07 04:32:25 crc kubenswrapper[4689]: I0307 04:32:25.549823 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:25 crc kubenswrapper[4689]: I0307 04:32:25.629851 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:27 crc kubenswrapper[4689]: I0307 04:32:27.617219 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-59v7l"] Mar 07 04:32:27 crc kubenswrapper[4689]: I0307 04:32:27.618329 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-59v7l" Mar 07 04:32:27 crc kubenswrapper[4689]: I0307 04:32:27.620829 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-9pcbg" Mar 07 04:32:27 crc kubenswrapper[4689]: I0307 04:32:27.621389 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 07 04:32:27 crc kubenswrapper[4689]: I0307 04:32:27.621520 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 07 04:32:27 crc kubenswrapper[4689]: I0307 04:32:27.645999 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-59v7l"] Mar 07 04:32:27 crc kubenswrapper[4689]: I0307 04:32:27.696440 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5vv4\" (UniqueName: \"kubernetes.io/projected/e4a1d98f-b91d-4a8b-aff6-9ccce0a17185-kube-api-access-s5vv4\") pod \"mariadb-operator-index-59v7l\" (UID: \"e4a1d98f-b91d-4a8b-aff6-9ccce0a17185\") " pod="openstack-operators/mariadb-operator-index-59v7l" Mar 07 04:32:27 crc kubenswrapper[4689]: I0307 04:32:27.797718 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5vv4\" (UniqueName: \"kubernetes.io/projected/e4a1d98f-b91d-4a8b-aff6-9ccce0a17185-kube-api-access-s5vv4\") 
pod \"mariadb-operator-index-59v7l\" (UID: \"e4a1d98f-b91d-4a8b-aff6-9ccce0a17185\") " pod="openstack-operators/mariadb-operator-index-59v7l" Mar 07 04:32:27 crc kubenswrapper[4689]: I0307 04:32:27.815026 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5vv4\" (UniqueName: \"kubernetes.io/projected/e4a1d98f-b91d-4a8b-aff6-9ccce0a17185-kube-api-access-s5vv4\") pod \"mariadb-operator-index-59v7l\" (UID: \"e4a1d98f-b91d-4a8b-aff6-9ccce0a17185\") " pod="openstack-operators/mariadb-operator-index-59v7l" Mar 07 04:32:27 crc kubenswrapper[4689]: I0307 04:32:27.944025 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-59v7l" Mar 07 04:32:28 crc kubenswrapper[4689]: I0307 04:32:28.174572 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-59v7l"] Mar 07 04:32:28 crc kubenswrapper[4689]: I0307 04:32:28.863003 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-59v7l" event={"ID":"e4a1d98f-b91d-4a8b-aff6-9ccce0a17185","Type":"ContainerStarted","Data":"9775bf8458ec7274273c7db6b9cdb7bd71985ed567932b97c141bd33c158ba54"} Mar 07 04:32:29 crc kubenswrapper[4689]: I0307 04:32:29.190401 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:32:29 crc kubenswrapper[4689]: I0307 04:32:29.190492 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 
04:32:30 crc kubenswrapper[4689]: I0307 04:32:30.638138 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-z2zk8" Mar 07 04:32:30 crc kubenswrapper[4689]: I0307 04:32:30.883397 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-59v7l" event={"ID":"e4a1d98f-b91d-4a8b-aff6-9ccce0a17185","Type":"ContainerStarted","Data":"bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150"} Mar 07 04:32:30 crc kubenswrapper[4689]: I0307 04:32:30.907941 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-59v7l" podStartSLOduration=2.024804633 podStartE2EDuration="3.907926147s" podCreationTimestamp="2026-03-07 04:32:27 +0000 UTC" firstStartedPulling="2026-03-07 04:32:28.184475196 +0000 UTC m=+793.230858695" lastFinishedPulling="2026-03-07 04:32:30.06759669 +0000 UTC m=+795.113980209" observedRunningTime="2026-03-07 04:32:30.904146914 +0000 UTC m=+795.950530423" watchObservedRunningTime="2026-03-07 04:32:30.907926147 +0000 UTC m=+795.954309636" Mar 07 04:32:30 crc kubenswrapper[4689]: I0307 04:32:30.990534 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-59v7l"] Mar 07 04:32:31 crc kubenswrapper[4689]: I0307 04:32:31.168422 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-d5xx7" Mar 07 04:32:31 crc kubenswrapper[4689]: I0307 04:32:31.596996 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-mzlbx"] Mar 07 04:32:31 crc kubenswrapper[4689]: I0307 04:32:31.597907 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-mzlbx" Mar 07 04:32:31 crc kubenswrapper[4689]: I0307 04:32:31.610533 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-mzlbx"] Mar 07 04:32:31 crc kubenswrapper[4689]: I0307 04:32:31.754994 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss6k7\" (UniqueName: \"kubernetes.io/projected/ee3b71b3-32e5-46a2-9f3e-589e7da005a4-kube-api-access-ss6k7\") pod \"mariadb-operator-index-mzlbx\" (UID: \"ee3b71b3-32e5-46a2-9f3e-589e7da005a4\") " pod="openstack-operators/mariadb-operator-index-mzlbx" Mar 07 04:32:31 crc kubenswrapper[4689]: I0307 04:32:31.856807 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss6k7\" (UniqueName: \"kubernetes.io/projected/ee3b71b3-32e5-46a2-9f3e-589e7da005a4-kube-api-access-ss6k7\") pod \"mariadb-operator-index-mzlbx\" (UID: \"ee3b71b3-32e5-46a2-9f3e-589e7da005a4\") " pod="openstack-operators/mariadb-operator-index-mzlbx" Mar 07 04:32:31 crc kubenswrapper[4689]: I0307 04:32:31.880192 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss6k7\" (UniqueName: \"kubernetes.io/projected/ee3b71b3-32e5-46a2-9f3e-589e7da005a4-kube-api-access-ss6k7\") pod \"mariadb-operator-index-mzlbx\" (UID: \"ee3b71b3-32e5-46a2-9f3e-589e7da005a4\") " pod="openstack-operators/mariadb-operator-index-mzlbx" Mar 07 04:32:31 crc kubenswrapper[4689]: I0307 04:32:31.925570 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-mzlbx" Mar 07 04:32:32 crc kubenswrapper[4689]: I0307 04:32:32.345200 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-mzlbx"] Mar 07 04:32:32 crc kubenswrapper[4689]: I0307 04:32:32.898285 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-59v7l" podUID="e4a1d98f-b91d-4a8b-aff6-9ccce0a17185" containerName="registry-server" containerID="cri-o://bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150" gracePeriod=2 Mar 07 04:32:32 crc kubenswrapper[4689]: I0307 04:32:32.898275 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-mzlbx" event={"ID":"ee3b71b3-32e5-46a2-9f3e-589e7da005a4","Type":"ContainerStarted","Data":"4c260547acbd1a6f12d9a180124b1fbb76177264802150fd79fb5ea023b010dc"} Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.648250 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-59v7l" Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.783798 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5vv4\" (UniqueName: \"kubernetes.io/projected/e4a1d98f-b91d-4a8b-aff6-9ccce0a17185-kube-api-access-s5vv4\") pod \"e4a1d98f-b91d-4a8b-aff6-9ccce0a17185\" (UID: \"e4a1d98f-b91d-4a8b-aff6-9ccce0a17185\") " Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.792011 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a1d98f-b91d-4a8b-aff6-9ccce0a17185-kube-api-access-s5vv4" (OuterVolumeSpecName: "kube-api-access-s5vv4") pod "e4a1d98f-b91d-4a8b-aff6-9ccce0a17185" (UID: "e4a1d98f-b91d-4a8b-aff6-9ccce0a17185"). InnerVolumeSpecName "kube-api-access-s5vv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.885658 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5vv4\" (UniqueName: \"kubernetes.io/projected/e4a1d98f-b91d-4a8b-aff6-9ccce0a17185-kube-api-access-s5vv4\") on node \"crc\" DevicePath \"\"" Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.908648 4689 generic.go:334] "Generic (PLEG): container finished" podID="e4a1d98f-b91d-4a8b-aff6-9ccce0a17185" containerID="bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150" exitCode=0 Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.908736 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-59v7l" Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.908727 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-59v7l" event={"ID":"e4a1d98f-b91d-4a8b-aff6-9ccce0a17185","Type":"ContainerDied","Data":"bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150"} Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.909107 4689 scope.go:117] "RemoveContainer" containerID="bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150" Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.909265 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-59v7l" event={"ID":"e4a1d98f-b91d-4a8b-aff6-9ccce0a17185","Type":"ContainerDied","Data":"9775bf8458ec7274273c7db6b9cdb7bd71985ed567932b97c141bd33c158ba54"} Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.932642 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-59v7l"] Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.936646 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-59v7l"] Mar 07 04:32:33 crc 
kubenswrapper[4689]: I0307 04:32:33.944354 4689 scope.go:117] "RemoveContainer" containerID="bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150" Mar 07 04:32:33 crc kubenswrapper[4689]: E0307 04:32:33.944904 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150\": container with ID starting with bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150 not found: ID does not exist" containerID="bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150" Mar 07 04:32:33 crc kubenswrapper[4689]: I0307 04:32:33.944968 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150"} err="failed to get container status \"bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150\": rpc error: code = NotFound desc = could not find container \"bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150\": container with ID starting with bab5a39bcb53596ed5a1c91d2210dff8227b4c99044aae215ab55053091fb150 not found: ID does not exist" Mar 07 04:32:34 crc kubenswrapper[4689]: I0307 04:32:34.923331 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-mzlbx" event={"ID":"ee3b71b3-32e5-46a2-9f3e-589e7da005a4","Type":"ContainerStarted","Data":"ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695"} Mar 07 04:32:34 crc kubenswrapper[4689]: I0307 04:32:34.951688 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-mzlbx" podStartSLOduration=2.343880751 podStartE2EDuration="3.951665038s" podCreationTimestamp="2026-03-07 04:32:31 +0000 UTC" firstStartedPulling="2026-03-07 04:32:32.357176635 +0000 UTC m=+797.403560124" lastFinishedPulling="2026-03-07 04:32:33.964960902 +0000 UTC 
m=+799.011344411" observedRunningTime="2026-03-07 04:32:34.945706167 +0000 UTC m=+799.992089706" watchObservedRunningTime="2026-03-07 04:32:34.951665038 +0000 UTC m=+799.998048537" Mar 07 04:32:35 crc kubenswrapper[4689]: I0307 04:32:35.834114 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a1d98f-b91d-4a8b-aff6-9ccce0a17185" path="/var/lib/kubelet/pods/e4a1d98f-b91d-4a8b-aff6-9ccce0a17185/volumes" Mar 07 04:32:40 crc kubenswrapper[4689]: I0307 04:32:40.555300 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-rgs5v" Mar 07 04:32:41 crc kubenswrapper[4689]: I0307 04:32:41.926264 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-mzlbx" Mar 07 04:32:41 crc kubenswrapper[4689]: I0307 04:32:41.926660 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-mzlbx" Mar 07 04:32:41 crc kubenswrapper[4689]: I0307 04:32:41.972863 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-mzlbx" Mar 07 04:32:42 crc kubenswrapper[4689]: I0307 04:32:42.020602 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-mzlbx" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.323560 4689 scope.go:117] "RemoveContainer" containerID="044b0606461e6ef3ea35a49511ea31bbbc16ced31b504b18a99d6cea618859e7" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.661017 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk"] Mar 07 04:32:49 crc kubenswrapper[4689]: E0307 04:32:49.661475 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a1d98f-b91d-4a8b-aff6-9ccce0a17185" containerName="registry-server" Mar 07 04:32:49 crc kubenswrapper[4689]: 
I0307 04:32:49.661523 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a1d98f-b91d-4a8b-aff6-9ccce0a17185" containerName="registry-server" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.661917 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a1d98f-b91d-4a8b-aff6-9ccce0a17185" containerName="registry-server" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.666833 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.670905 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4j8gt" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.677495 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk"] Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.819423 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlvmz\" (UniqueName: \"kubernetes.io/projected/99f93314-9b2f-4bac-90ac-20c44ed8b998-kube-api-access-jlvmz\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.819496 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-bundle\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 
04:32:49.819654 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-util\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.920924 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-util\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.921637 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlvmz\" (UniqueName: \"kubernetes.io/projected/99f93314-9b2f-4bac-90ac-20c44ed8b998-kube-api-access-jlvmz\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.921717 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-bundle\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.921742 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-util\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.922476 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-bundle\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.993025 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlvmz\" (UniqueName: \"kubernetes.io/projected/99f93314-9b2f-4bac-90ac-20c44ed8b998-kube-api-access-jlvmz\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:49 crc kubenswrapper[4689]: I0307 04:32:49.994008 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:50 crc kubenswrapper[4689]: I0307 04:32:50.254463 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk"] Mar 07 04:32:51 crc kubenswrapper[4689]: I0307 04:32:51.059463 4689 generic.go:334] "Generic (PLEG): container finished" podID="99f93314-9b2f-4bac-90ac-20c44ed8b998" containerID="437d29f2802f7fdeac8810f3b54b21fdbf943ed5cf5e20263f57909ee89728a1" exitCode=0 Mar 07 04:32:51 crc kubenswrapper[4689]: I0307 04:32:51.059546 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" event={"ID":"99f93314-9b2f-4bac-90ac-20c44ed8b998","Type":"ContainerDied","Data":"437d29f2802f7fdeac8810f3b54b21fdbf943ed5cf5e20263f57909ee89728a1"} Mar 07 04:32:51 crc kubenswrapper[4689]: I0307 04:32:51.059630 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" event={"ID":"99f93314-9b2f-4bac-90ac-20c44ed8b998","Type":"ContainerStarted","Data":"2abf65e2b9911253646904d29d67f250ff0f5a4233936f4f42d5c105d0de1afa"} Mar 07 04:32:52 crc kubenswrapper[4689]: I0307 04:32:52.067019 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" event={"ID":"99f93314-9b2f-4bac-90ac-20c44ed8b998","Type":"ContainerStarted","Data":"5d6a86917778a56ef7e6a6e34b7267230910937ec2cffcd68f2763f316f886d9"} Mar 07 04:32:53 crc kubenswrapper[4689]: I0307 04:32:53.078914 4689 generic.go:334] "Generic (PLEG): container finished" podID="99f93314-9b2f-4bac-90ac-20c44ed8b998" containerID="5d6a86917778a56ef7e6a6e34b7267230910937ec2cffcd68f2763f316f886d9" exitCode=0 Mar 07 04:32:53 crc kubenswrapper[4689]: I0307 04:32:53.078973 4689 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" event={"ID":"99f93314-9b2f-4bac-90ac-20c44ed8b998","Type":"ContainerDied","Data":"5d6a86917778a56ef7e6a6e34b7267230910937ec2cffcd68f2763f316f886d9"} Mar 07 04:32:54 crc kubenswrapper[4689]: I0307 04:32:54.088115 4689 generic.go:334] "Generic (PLEG): container finished" podID="99f93314-9b2f-4bac-90ac-20c44ed8b998" containerID="37fc54a838abbd84f29cf51bdff86617b52601ce04d177aa8f6a9684fa59f10e" exitCode=0 Mar 07 04:32:54 crc kubenswrapper[4689]: I0307 04:32:54.088225 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" event={"ID":"99f93314-9b2f-4bac-90ac-20c44ed8b998","Type":"ContainerDied","Data":"37fc54a838abbd84f29cf51bdff86617b52601ce04d177aa8f6a9684fa59f10e"} Mar 07 04:32:55 crc kubenswrapper[4689]: I0307 04:32:55.386871 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:55 crc kubenswrapper[4689]: I0307 04:32:55.505670 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-util\") pod \"99f93314-9b2f-4bac-90ac-20c44ed8b998\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " Mar 07 04:32:55 crc kubenswrapper[4689]: I0307 04:32:55.505794 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlvmz\" (UniqueName: \"kubernetes.io/projected/99f93314-9b2f-4bac-90ac-20c44ed8b998-kube-api-access-jlvmz\") pod \"99f93314-9b2f-4bac-90ac-20c44ed8b998\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " Mar 07 04:32:55 crc kubenswrapper[4689]: I0307 04:32:55.505834 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-bundle\") pod \"99f93314-9b2f-4bac-90ac-20c44ed8b998\" (UID: \"99f93314-9b2f-4bac-90ac-20c44ed8b998\") " Mar 07 04:32:55 crc kubenswrapper[4689]: I0307 04:32:55.507563 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-bundle" (OuterVolumeSpecName: "bundle") pod "99f93314-9b2f-4bac-90ac-20c44ed8b998" (UID: "99f93314-9b2f-4bac-90ac-20c44ed8b998"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:32:55 crc kubenswrapper[4689]: I0307 04:32:55.511061 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f93314-9b2f-4bac-90ac-20c44ed8b998-kube-api-access-jlvmz" (OuterVolumeSpecName: "kube-api-access-jlvmz") pod "99f93314-9b2f-4bac-90ac-20c44ed8b998" (UID: "99f93314-9b2f-4bac-90ac-20c44ed8b998"). InnerVolumeSpecName "kube-api-access-jlvmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:32:55 crc kubenswrapper[4689]: I0307 04:32:55.533872 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-util" (OuterVolumeSpecName: "util") pod "99f93314-9b2f-4bac-90ac-20c44ed8b998" (UID: "99f93314-9b2f-4bac-90ac-20c44ed8b998"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:32:55 crc kubenswrapper[4689]: I0307 04:32:55.608034 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlvmz\" (UniqueName: \"kubernetes.io/projected/99f93314-9b2f-4bac-90ac-20c44ed8b998-kube-api-access-jlvmz\") on node \"crc\" DevicePath \"\"" Mar 07 04:32:55 crc kubenswrapper[4689]: I0307 04:32:55.608114 4689 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:32:55 crc kubenswrapper[4689]: I0307 04:32:55.608139 4689 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f93314-9b2f-4bac-90ac-20c44ed8b998-util\") on node \"crc\" DevicePath \"\"" Mar 07 04:32:56 crc kubenswrapper[4689]: I0307 04:32:56.112363 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" event={"ID":"99f93314-9b2f-4bac-90ac-20c44ed8b998","Type":"ContainerDied","Data":"2abf65e2b9911253646904d29d67f250ff0f5a4233936f4f42d5c105d0de1afa"} Mar 07 04:32:56 crc kubenswrapper[4689]: I0307 04:32:56.112424 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2abf65e2b9911253646904d29d67f250ff0f5a4233936f4f42d5c105d0de1afa" Mar 07 04:32:56 crc kubenswrapper[4689]: I0307 04:32:56.112534 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk" Mar 07 04:32:59 crc kubenswrapper[4689]: I0307 04:32:59.190048 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:32:59 crc kubenswrapper[4689]: I0307 04:32:59.190424 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:33:02 crc kubenswrapper[4689]: I0307 04:33:02.822017 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr"] Mar 07 04:33:02 crc kubenswrapper[4689]: E0307 04:33:02.822430 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f93314-9b2f-4bac-90ac-20c44ed8b998" containerName="util" Mar 07 04:33:02 crc kubenswrapper[4689]: I0307 04:33:02.822441 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f93314-9b2f-4bac-90ac-20c44ed8b998" containerName="util" Mar 07 04:33:02 crc kubenswrapper[4689]: E0307 04:33:02.822455 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f93314-9b2f-4bac-90ac-20c44ed8b998" containerName="pull" Mar 07 04:33:02 crc kubenswrapper[4689]: I0307 04:33:02.822461 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f93314-9b2f-4bac-90ac-20c44ed8b998" containerName="pull" Mar 07 04:33:02 crc kubenswrapper[4689]: E0307 04:33:02.822471 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f93314-9b2f-4bac-90ac-20c44ed8b998" 
containerName="extract" Mar 07 04:33:02 crc kubenswrapper[4689]: I0307 04:33:02.822478 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f93314-9b2f-4bac-90ac-20c44ed8b998" containerName="extract" Mar 07 04:33:02 crc kubenswrapper[4689]: I0307 04:33:02.822561 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f93314-9b2f-4bac-90ac-20c44ed8b998" containerName="extract" Mar 07 04:33:02 crc kubenswrapper[4689]: I0307 04:33:02.822907 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:02 crc kubenswrapper[4689]: I0307 04:33:02.824550 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Mar 07 04:33:02 crc kubenswrapper[4689]: I0307 04:33:02.824581 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 07 04:33:02 crc kubenswrapper[4689]: I0307 04:33:02.824927 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-7hxdv" Mar 07 04:33:02 crc kubenswrapper[4689]: I0307 04:33:02.869337 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr"] Mar 07 04:33:03 crc kubenswrapper[4689]: I0307 04:33:03.012616 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-apiservice-cert\") pod \"mariadb-operator-controller-manager-64bcb8dbcf-9hmhr\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:03 crc kubenswrapper[4689]: I0307 04:33:03.012742 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-webhook-cert\") pod \"mariadb-operator-controller-manager-64bcb8dbcf-9hmhr\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:03 crc kubenswrapper[4689]: I0307 04:33:03.012824 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr5mq\" (UniqueName: \"kubernetes.io/projected/44ccc3e9-523e-49f4-a647-87bad23b837f-kube-api-access-wr5mq\") pod \"mariadb-operator-controller-manager-64bcb8dbcf-9hmhr\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:03 crc kubenswrapper[4689]: I0307 04:33:03.114383 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr5mq\" (UniqueName: \"kubernetes.io/projected/44ccc3e9-523e-49f4-a647-87bad23b837f-kube-api-access-wr5mq\") pod \"mariadb-operator-controller-manager-64bcb8dbcf-9hmhr\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:03 crc kubenswrapper[4689]: I0307 04:33:03.114830 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-apiservice-cert\") pod \"mariadb-operator-controller-manager-64bcb8dbcf-9hmhr\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:03 crc kubenswrapper[4689]: I0307 04:33:03.114900 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-webhook-cert\") pod \"mariadb-operator-controller-manager-64bcb8dbcf-9hmhr\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:03 crc kubenswrapper[4689]: I0307 04:33:03.122640 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-apiservice-cert\") pod \"mariadb-operator-controller-manager-64bcb8dbcf-9hmhr\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:03 crc kubenswrapper[4689]: I0307 04:33:03.129757 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-webhook-cert\") pod \"mariadb-operator-controller-manager-64bcb8dbcf-9hmhr\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:03 crc kubenswrapper[4689]: I0307 04:33:03.134750 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr5mq\" (UniqueName: \"kubernetes.io/projected/44ccc3e9-523e-49f4-a647-87bad23b837f-kube-api-access-wr5mq\") pod \"mariadb-operator-controller-manager-64bcb8dbcf-9hmhr\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:03 crc kubenswrapper[4689]: I0307 04:33:03.140409 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:03 crc kubenswrapper[4689]: I0307 04:33:03.406708 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr"] Mar 07 04:33:03 crc kubenswrapper[4689]: W0307 04:33:03.419701 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ccc3e9_523e_49f4_a647_87bad23b837f.slice/crio-af4641fb206d395bc745a9599f9368d79e2a02f08c9f00e13f3c94db2e2a1bd0 WatchSource:0}: Error finding container af4641fb206d395bc745a9599f9368d79e2a02f08c9f00e13f3c94db2e2a1bd0: Status 404 returned error can't find the container with id af4641fb206d395bc745a9599f9368d79e2a02f08c9f00e13f3c94db2e2a1bd0 Mar 07 04:33:04 crc kubenswrapper[4689]: I0307 04:33:04.162248 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" event={"ID":"44ccc3e9-523e-49f4-a647-87bad23b837f","Type":"ContainerStarted","Data":"af4641fb206d395bc745a9599f9368d79e2a02f08c9f00e13f3c94db2e2a1bd0"} Mar 07 04:33:07 crc kubenswrapper[4689]: I0307 04:33:07.189464 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" event={"ID":"44ccc3e9-523e-49f4-a647-87bad23b837f","Type":"ContainerStarted","Data":"931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee"} Mar 07 04:33:07 crc kubenswrapper[4689]: I0307 04:33:07.191935 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:07 crc kubenswrapper[4689]: I0307 04:33:07.228583 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" podStartSLOduration=1.912034764 
podStartE2EDuration="5.228564397s" podCreationTimestamp="2026-03-07 04:33:02 +0000 UTC" firstStartedPulling="2026-03-07 04:33:03.42212916 +0000 UTC m=+828.468512649" lastFinishedPulling="2026-03-07 04:33:06.738658793 +0000 UTC m=+831.785042282" observedRunningTime="2026-03-07 04:33:07.225783142 +0000 UTC m=+832.272166671" watchObservedRunningTime="2026-03-07 04:33:07.228564397 +0000 UTC m=+832.274947896" Mar 07 04:33:13 crc kubenswrapper[4689]: I0307 04:33:13.147534 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:33:14 crc kubenswrapper[4689]: I0307 04:33:14.047089 4689 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 04:33:16 crc kubenswrapper[4689]: I0307 04:33:16.391053 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-5ps7l"] Mar 07 04:33:16 crc kubenswrapper[4689]: I0307 04:33:16.392821 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-5ps7l" Mar 07 04:33:16 crc kubenswrapper[4689]: I0307 04:33:16.394234 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-n85rh" Mar 07 04:33:16 crc kubenswrapper[4689]: I0307 04:33:16.400112 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-5ps7l"] Mar 07 04:33:16 crc kubenswrapper[4689]: I0307 04:33:16.512971 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j46p5\" (UniqueName: \"kubernetes.io/projected/83abe50d-3920-4550-af87-bd6b0d865e9d-kube-api-access-j46p5\") pod \"infra-operator-index-5ps7l\" (UID: \"83abe50d-3920-4550-af87-bd6b0d865e9d\") " pod="openstack-operators/infra-operator-index-5ps7l" Mar 07 04:33:16 crc kubenswrapper[4689]: I0307 04:33:16.613816 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j46p5\" (UniqueName: \"kubernetes.io/projected/83abe50d-3920-4550-af87-bd6b0d865e9d-kube-api-access-j46p5\") pod \"infra-operator-index-5ps7l\" (UID: \"83abe50d-3920-4550-af87-bd6b0d865e9d\") " pod="openstack-operators/infra-operator-index-5ps7l" Mar 07 04:33:16 crc kubenswrapper[4689]: I0307 04:33:16.653517 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j46p5\" (UniqueName: \"kubernetes.io/projected/83abe50d-3920-4550-af87-bd6b0d865e9d-kube-api-access-j46p5\") pod \"infra-operator-index-5ps7l\" (UID: \"83abe50d-3920-4550-af87-bd6b0d865e9d\") " pod="openstack-operators/infra-operator-index-5ps7l" Mar 07 04:33:16 crc kubenswrapper[4689]: I0307 04:33:16.720606 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-5ps7l" Mar 07 04:33:16 crc kubenswrapper[4689]: I0307 04:33:16.997482 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-5ps7l"] Mar 07 04:33:17 crc kubenswrapper[4689]: W0307 04:33:17.022329 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83abe50d_3920_4550_af87_bd6b0d865e9d.slice/crio-8097b57f2d3f47cee729fa6f26175e76a5d2b57c858035232945789521d0bb05 WatchSource:0}: Error finding container 8097b57f2d3f47cee729fa6f26175e76a5d2b57c858035232945789521d0bb05: Status 404 returned error can't find the container with id 8097b57f2d3f47cee729fa6f26175e76a5d2b57c858035232945789521d0bb05 Mar 07 04:33:17 crc kubenswrapper[4689]: I0307 04:33:17.990537 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5ps7l" event={"ID":"83abe50d-3920-4550-af87-bd6b0d865e9d","Type":"ContainerStarted","Data":"8097b57f2d3f47cee729fa6f26175e76a5d2b57c858035232945789521d0bb05"} Mar 07 04:33:19 crc kubenswrapper[4689]: I0307 04:33:19.001618 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5ps7l" event={"ID":"83abe50d-3920-4550-af87-bd6b0d865e9d","Type":"ContainerStarted","Data":"39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f"} Mar 07 04:33:19 crc kubenswrapper[4689]: I0307 04:33:19.028893 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-5ps7l" podStartSLOduration=2.153457923 podStartE2EDuration="3.02887144s" podCreationTimestamp="2026-03-07 04:33:16 +0000 UTC" firstStartedPulling="2026-03-07 04:33:17.02457191 +0000 UTC m=+842.070955429" lastFinishedPulling="2026-03-07 04:33:17.899985447 +0000 UTC m=+842.946368946" observedRunningTime="2026-03-07 04:33:19.02631235 +0000 UTC m=+844.072695909" 
watchObservedRunningTime="2026-03-07 04:33:19.02887144 +0000 UTC m=+844.075254949" Mar 07 04:33:19 crc kubenswrapper[4689]: I0307 04:33:19.596282 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-5ps7l"] Mar 07 04:33:20 crc kubenswrapper[4689]: I0307 04:33:20.199445 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-rnf5j"] Mar 07 04:33:20 crc kubenswrapper[4689]: I0307 04:33:20.200866 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-rnf5j" Mar 07 04:33:20 crc kubenswrapper[4689]: I0307 04:33:20.209722 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-rnf5j"] Mar 07 04:33:20 crc kubenswrapper[4689]: I0307 04:33:20.369977 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrkkm\" (UniqueName: \"kubernetes.io/projected/b07d06d3-554f-4c41-b001-e5d9338bbdf4-kube-api-access-qrkkm\") pod \"infra-operator-index-rnf5j\" (UID: \"b07d06d3-554f-4c41-b001-e5d9338bbdf4\") " pod="openstack-operators/infra-operator-index-rnf5j" Mar 07 04:33:20 crc kubenswrapper[4689]: I0307 04:33:20.470973 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrkkm\" (UniqueName: \"kubernetes.io/projected/b07d06d3-554f-4c41-b001-e5d9338bbdf4-kube-api-access-qrkkm\") pod \"infra-operator-index-rnf5j\" (UID: \"b07d06d3-554f-4c41-b001-e5d9338bbdf4\") " pod="openstack-operators/infra-operator-index-rnf5j" Mar 07 04:33:20 crc kubenswrapper[4689]: I0307 04:33:20.509144 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrkkm\" (UniqueName: \"kubernetes.io/projected/b07d06d3-554f-4c41-b001-e5d9338bbdf4-kube-api-access-qrkkm\") pod \"infra-operator-index-rnf5j\" (UID: \"b07d06d3-554f-4c41-b001-e5d9338bbdf4\") " 
pod="openstack-operators/infra-operator-index-rnf5j" Mar 07 04:33:20 crc kubenswrapper[4689]: I0307 04:33:20.542024 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-rnf5j" Mar 07 04:33:20 crc kubenswrapper[4689]: I0307 04:33:20.788296 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-rnf5j"] Mar 07 04:33:20 crc kubenswrapper[4689]: W0307 04:33:20.794812 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb07d06d3_554f_4c41_b001_e5d9338bbdf4.slice/crio-447a44482c07136c3563632351627ce42e19ecddc6fcd3fade3598bc082c8ec0 WatchSource:0}: Error finding container 447a44482c07136c3563632351627ce42e19ecddc6fcd3fade3598bc082c8ec0: Status 404 returned error can't find the container with id 447a44482c07136c3563632351627ce42e19ecddc6fcd3fade3598bc082c8ec0 Mar 07 04:33:21 crc kubenswrapper[4689]: I0307 04:33:21.019045 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-rnf5j" event={"ID":"b07d06d3-554f-4c41-b001-e5d9338bbdf4","Type":"ContainerStarted","Data":"447a44482c07136c3563632351627ce42e19ecddc6fcd3fade3598bc082c8ec0"} Mar 07 04:33:21 crc kubenswrapper[4689]: I0307 04:33:21.019239 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-5ps7l" podUID="83abe50d-3920-4550-af87-bd6b0d865e9d" containerName="registry-server" containerID="cri-o://39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f" gracePeriod=2 Mar 07 04:33:21 crc kubenswrapper[4689]: I0307 04:33:21.478608 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-5ps7l" Mar 07 04:33:21 crc kubenswrapper[4689]: I0307 04:33:21.503092 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j46p5\" (UniqueName: \"kubernetes.io/projected/83abe50d-3920-4550-af87-bd6b0d865e9d-kube-api-access-j46p5\") pod \"83abe50d-3920-4550-af87-bd6b0d865e9d\" (UID: \"83abe50d-3920-4550-af87-bd6b0d865e9d\") " Mar 07 04:33:21 crc kubenswrapper[4689]: I0307 04:33:21.515403 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83abe50d-3920-4550-af87-bd6b0d865e9d-kube-api-access-j46p5" (OuterVolumeSpecName: "kube-api-access-j46p5") pod "83abe50d-3920-4550-af87-bd6b0d865e9d" (UID: "83abe50d-3920-4550-af87-bd6b0d865e9d"). InnerVolumeSpecName "kube-api-access-j46p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:33:21 crc kubenswrapper[4689]: I0307 04:33:21.604462 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j46p5\" (UniqueName: \"kubernetes.io/projected/83abe50d-3920-4550-af87-bd6b0d865e9d-kube-api-access-j46p5\") on node \"crc\" DevicePath \"\"" Mar 07 04:33:22 crc kubenswrapper[4689]: I0307 04:33:22.026351 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-rnf5j" event={"ID":"b07d06d3-554f-4c41-b001-e5d9338bbdf4","Type":"ContainerStarted","Data":"55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf"} Mar 07 04:33:22 crc kubenswrapper[4689]: I0307 04:33:22.027940 4689 generic.go:334] "Generic (PLEG): container finished" podID="83abe50d-3920-4550-af87-bd6b0d865e9d" containerID="39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f" exitCode=0 Mar 07 04:33:22 crc kubenswrapper[4689]: I0307 04:33:22.027970 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5ps7l" 
event={"ID":"83abe50d-3920-4550-af87-bd6b0d865e9d","Type":"ContainerDied","Data":"39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f"} Mar 07 04:33:22 crc kubenswrapper[4689]: I0307 04:33:22.027986 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5ps7l" event={"ID":"83abe50d-3920-4550-af87-bd6b0d865e9d","Type":"ContainerDied","Data":"8097b57f2d3f47cee729fa6f26175e76a5d2b57c858035232945789521d0bb05"} Mar 07 04:33:22 crc kubenswrapper[4689]: I0307 04:33:22.028005 4689 scope.go:117] "RemoveContainer" containerID="39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f" Mar 07 04:33:22 crc kubenswrapper[4689]: I0307 04:33:22.028059 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-5ps7l" Mar 07 04:33:22 crc kubenswrapper[4689]: I0307 04:33:22.058815 4689 scope.go:117] "RemoveContainer" containerID="39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f" Mar 07 04:33:22 crc kubenswrapper[4689]: E0307 04:33:22.059481 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f\": container with ID starting with 39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f not found: ID does not exist" containerID="39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f" Mar 07 04:33:22 crc kubenswrapper[4689]: I0307 04:33:22.059560 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f"} err="failed to get container status \"39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f\": rpc error: code = NotFound desc = could not find container \"39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f\": container with ID starting with 
39c5937fb479c2d317e1818156cdc12fcd314b6bb5b6404b6149d1e14252425f not found: ID does not exist" Mar 07 04:33:22 crc kubenswrapper[4689]: I0307 04:33:22.060550 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-rnf5j" podStartSLOduration=1.650920808 podStartE2EDuration="2.060517926s" podCreationTimestamp="2026-03-07 04:33:20 +0000 UTC" firstStartedPulling="2026-03-07 04:33:20.798312875 +0000 UTC m=+845.844696364" lastFinishedPulling="2026-03-07 04:33:21.207909983 +0000 UTC m=+846.254293482" observedRunningTime="2026-03-07 04:33:22.05315163 +0000 UTC m=+847.099535159" watchObservedRunningTime="2026-03-07 04:33:22.060517926 +0000 UTC m=+847.106901455" Mar 07 04:33:22 crc kubenswrapper[4689]: I0307 04:33:22.075303 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-5ps7l"] Mar 07 04:33:22 crc kubenswrapper[4689]: I0307 04:33:22.079881 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-5ps7l"] Mar 07 04:33:23 crc kubenswrapper[4689]: I0307 04:33:23.837697 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83abe50d-3920-4550-af87-bd6b0d865e9d" path="/var/lib/kubelet/pods/83abe50d-3920-4550-af87-bd6b0d865e9d/volumes" Mar 07 04:33:29 crc kubenswrapper[4689]: I0307 04:33:29.190320 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:33:29 crc kubenswrapper[4689]: I0307 04:33:29.190742 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:33:29 crc kubenswrapper[4689]: I0307 04:33:29.190821 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:33:29 crc kubenswrapper[4689]: I0307 04:33:29.191906 4689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c811faf449bec22216350a82fb0e4edb8efb6f32a1e999aafd915dabcad4588"} pod="openshift-machine-config-operator/machine-config-daemon-dss5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 04:33:29 crc kubenswrapper[4689]: I0307 04:33:29.192023 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" containerID="cri-o://9c811faf449bec22216350a82fb0e4edb8efb6f32a1e999aafd915dabcad4588" gracePeriod=600 Mar 07 04:33:30 crc kubenswrapper[4689]: I0307 04:33:30.104983 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerID="9c811faf449bec22216350a82fb0e4edb8efb6f32a1e999aafd915dabcad4588" exitCode=0 Mar 07 04:33:30 crc kubenswrapper[4689]: I0307 04:33:30.105665 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerDied","Data":"9c811faf449bec22216350a82fb0e4edb8efb6f32a1e999aafd915dabcad4588"} Mar 07 04:33:30 crc kubenswrapper[4689]: I0307 04:33:30.105731 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" 
event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"095186d39ccb32197b5727728ef69f96ce62106ff83eff2af68654fa691615da"} Mar 07 04:33:30 crc kubenswrapper[4689]: I0307 04:33:30.105760 4689 scope.go:117] "RemoveContainer" containerID="b929a5d6764e60d2412d02d2d30426f108f4c1d195b3fd95c7435c02b959921b" Mar 07 04:33:30 crc kubenswrapper[4689]: I0307 04:33:30.543363 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-rnf5j" Mar 07 04:33:30 crc kubenswrapper[4689]: I0307 04:33:30.544949 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-rnf5j" Mar 07 04:33:30 crc kubenswrapper[4689]: I0307 04:33:30.591530 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-rnf5j" Mar 07 04:33:31 crc kubenswrapper[4689]: I0307 04:33:31.156803 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-rnf5j" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.025830 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h"] Mar 07 04:33:39 crc kubenswrapper[4689]: E0307 04:33:39.026574 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83abe50d-3920-4550-af87-bd6b0d865e9d" containerName="registry-server" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.026593 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="83abe50d-3920-4550-af87-bd6b0d865e9d" containerName="registry-server" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.026749 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="83abe50d-3920-4550-af87-bd6b0d865e9d" containerName="registry-server" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.027717 4689 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.031385 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4j8gt" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.042466 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h"] Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.061084 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.061143 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4njtl\" (UniqueName: \"kubernetes.io/projected/62285b0a-fa87-4b64-b313-62f820cc9467-kube-api-access-4njtl\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.061172 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:39 crc kubenswrapper[4689]: 
I0307 04:33:39.162781 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.162846 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4njtl\" (UniqueName: \"kubernetes.io/projected/62285b0a-fa87-4b64-b313-62f820cc9467-kube-api-access-4njtl\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.162876 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.163267 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.163279 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.186530 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4njtl\" (UniqueName: \"kubernetes.io/projected/62285b0a-fa87-4b64-b313-62f820cc9467-kube-api-access-4njtl\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.362941 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:39 crc kubenswrapper[4689]: I0307 04:33:39.887117 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h"] Mar 07 04:33:39 crc kubenswrapper[4689]: W0307 04:33:39.901104 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62285b0a_fa87_4b64_b313_62f820cc9467.slice/crio-db790d50db326e792ecdad12b7fdbeef0495f8406ee7d4fe2803e407ac8f0f0c WatchSource:0}: Error finding container db790d50db326e792ecdad12b7fdbeef0495f8406ee7d4fe2803e407ac8f0f0c: Status 404 returned error can't find the container with id db790d50db326e792ecdad12b7fdbeef0495f8406ee7d4fe2803e407ac8f0f0c Mar 07 04:33:40 crc kubenswrapper[4689]: I0307 04:33:40.180986 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" 
event={"ID":"62285b0a-fa87-4b64-b313-62f820cc9467","Type":"ContainerStarted","Data":"9cafcc6cec675f507108af2538822d2ec2e7bc27678e7e2758e3df9f8b5b19df"} Mar 07 04:33:40 crc kubenswrapper[4689]: I0307 04:33:40.181044 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" event={"ID":"62285b0a-fa87-4b64-b313-62f820cc9467","Type":"ContainerStarted","Data":"db790d50db326e792ecdad12b7fdbeef0495f8406ee7d4fe2803e407ac8f0f0c"} Mar 07 04:33:41 crc kubenswrapper[4689]: I0307 04:33:41.191901 4689 generic.go:334] "Generic (PLEG): container finished" podID="62285b0a-fa87-4b64-b313-62f820cc9467" containerID="9cafcc6cec675f507108af2538822d2ec2e7bc27678e7e2758e3df9f8b5b19df" exitCode=0 Mar 07 04:33:41 crc kubenswrapper[4689]: I0307 04:33:41.191945 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" event={"ID":"62285b0a-fa87-4b64-b313-62f820cc9467","Type":"ContainerDied","Data":"9cafcc6cec675f507108af2538822d2ec2e7bc27678e7e2758e3df9f8b5b19df"} Mar 07 04:33:42 crc kubenswrapper[4689]: I0307 04:33:42.198981 4689 generic.go:334] "Generic (PLEG): container finished" podID="62285b0a-fa87-4b64-b313-62f820cc9467" containerID="37a29d50dfdb5292360a814d6cc16d4e330cc2410daace4af257df61f7af8260" exitCode=0 Mar 07 04:33:42 crc kubenswrapper[4689]: I0307 04:33:42.199024 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" event={"ID":"62285b0a-fa87-4b64-b313-62f820cc9467","Type":"ContainerDied","Data":"37a29d50dfdb5292360a814d6cc16d4e330cc2410daace4af257df61f7af8260"} Mar 07 04:33:43 crc kubenswrapper[4689]: I0307 04:33:43.211519 4689 generic.go:334] "Generic (PLEG): container finished" podID="62285b0a-fa87-4b64-b313-62f820cc9467" containerID="e3269f3d9967a4c91ba27c543fa32589a9378c0078bea79b1a2bf3a9704a94bc" exitCode=0 Mar 
07 04:33:43 crc kubenswrapper[4689]: I0307 04:33:43.211652 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" event={"ID":"62285b0a-fa87-4b64-b313-62f820cc9467","Type":"ContainerDied","Data":"e3269f3d9967a4c91ba27c543fa32589a9378c0078bea79b1a2bf3a9704a94bc"} Mar 07 04:33:44 crc kubenswrapper[4689]: I0307 04:33:44.437449 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:44 crc kubenswrapper[4689]: I0307 04:33:44.635347 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-util\") pod \"62285b0a-fa87-4b64-b313-62f820cc9467\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " Mar 07 04:33:44 crc kubenswrapper[4689]: I0307 04:33:44.635411 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-bundle\") pod \"62285b0a-fa87-4b64-b313-62f820cc9467\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " Mar 07 04:33:44 crc kubenswrapper[4689]: I0307 04:33:44.635451 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4njtl\" (UniqueName: \"kubernetes.io/projected/62285b0a-fa87-4b64-b313-62f820cc9467-kube-api-access-4njtl\") pod \"62285b0a-fa87-4b64-b313-62f820cc9467\" (UID: \"62285b0a-fa87-4b64-b313-62f820cc9467\") " Mar 07 04:33:44 crc kubenswrapper[4689]: I0307 04:33:44.638002 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-bundle" (OuterVolumeSpecName: "bundle") pod "62285b0a-fa87-4b64-b313-62f820cc9467" (UID: "62285b0a-fa87-4b64-b313-62f820cc9467"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:33:44 crc kubenswrapper[4689]: I0307 04:33:44.646381 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62285b0a-fa87-4b64-b313-62f820cc9467-kube-api-access-4njtl" (OuterVolumeSpecName: "kube-api-access-4njtl") pod "62285b0a-fa87-4b64-b313-62f820cc9467" (UID: "62285b0a-fa87-4b64-b313-62f820cc9467"). InnerVolumeSpecName "kube-api-access-4njtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:33:44 crc kubenswrapper[4689]: I0307 04:33:44.667315 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-util" (OuterVolumeSpecName: "util") pod "62285b0a-fa87-4b64-b313-62f820cc9467" (UID: "62285b0a-fa87-4b64-b313-62f820cc9467"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:33:44 crc kubenswrapper[4689]: I0307 04:33:44.737011 4689 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:33:44 crc kubenswrapper[4689]: I0307 04:33:44.737104 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4njtl\" (UniqueName: \"kubernetes.io/projected/62285b0a-fa87-4b64-b313-62f820cc9467-kube-api-access-4njtl\") on node \"crc\" DevicePath \"\"" Mar 07 04:33:44 crc kubenswrapper[4689]: I0307 04:33:44.737125 4689 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62285b0a-fa87-4b64-b313-62f820cc9467-util\") on node \"crc\" DevicePath \"\"" Mar 07 04:33:45 crc kubenswrapper[4689]: I0307 04:33:45.228344 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" 
event={"ID":"62285b0a-fa87-4b64-b313-62f820cc9467","Type":"ContainerDied","Data":"db790d50db326e792ecdad12b7fdbeef0495f8406ee7d4fe2803e407ac8f0f0c"} Mar 07 04:33:45 crc kubenswrapper[4689]: I0307 04:33:45.228409 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db790d50db326e792ecdad12b7fdbeef0495f8406ee7d4fe2803e407ac8f0f0c" Mar 07 04:33:45 crc kubenswrapper[4689]: I0307 04:33:45.228454 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.697203 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4"] Mar 07 04:33:55 crc kubenswrapper[4689]: E0307 04:33:55.698004 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62285b0a-fa87-4b64-b313-62f820cc9467" containerName="util" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.698018 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="62285b0a-fa87-4b64-b313-62f820cc9467" containerName="util" Mar 07 04:33:55 crc kubenswrapper[4689]: E0307 04:33:55.698035 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62285b0a-fa87-4b64-b313-62f820cc9467" containerName="pull" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.698043 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="62285b0a-fa87-4b64-b313-62f820cc9467" containerName="pull" Mar 07 04:33:55 crc kubenswrapper[4689]: E0307 04:33:55.698062 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62285b0a-fa87-4b64-b313-62f820cc9467" containerName="extract" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.698069 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="62285b0a-fa87-4b64-b313-62f820cc9467" containerName="extract" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.698217 4689 
memory_manager.go:354] "RemoveStaleState removing state" podUID="62285b0a-fa87-4b64-b313-62f820cc9467" containerName="extract" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.698695 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.700718 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.701344 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vgkvl" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.716850 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4"] Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.884839 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d885l\" (UniqueName: \"kubernetes.io/projected/9edd6ad0-247d-45f0-95e9-0291d649c6ec-kube-api-access-d885l\") pod \"infra-operator-controller-manager-9b75f4d4d-869m4\" (UID: \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.884977 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-apiservice-cert\") pod \"infra-operator-controller-manager-9b75f4d4d-869m4\" (UID: \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.885117 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-webhook-cert\") pod \"infra-operator-controller-manager-9b75f4d4d-869m4\" (UID: \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.986744 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-webhook-cert\") pod \"infra-operator-controller-manager-9b75f4d4d-869m4\" (UID: \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.986783 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d885l\" (UniqueName: \"kubernetes.io/projected/9edd6ad0-247d-45f0-95e9-0291d649c6ec-kube-api-access-d885l\") pod \"infra-operator-controller-manager-9b75f4d4d-869m4\" (UID: \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.986816 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-apiservice-cert\") pod \"infra-operator-controller-manager-9b75f4d4d-869m4\" (UID: \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:55 crc kubenswrapper[4689]: I0307 04:33:55.996443 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-webhook-cert\") pod \"infra-operator-controller-manager-9b75f4d4d-869m4\" (UID: 
\"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:56 crc kubenswrapper[4689]: I0307 04:33:56.001519 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-apiservice-cert\") pod \"infra-operator-controller-manager-9b75f4d4d-869m4\" (UID: \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:56 crc kubenswrapper[4689]: I0307 04:33:56.017381 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d885l\" (UniqueName: \"kubernetes.io/projected/9edd6ad0-247d-45f0-95e9-0291d649c6ec-kube-api-access-d885l\") pod \"infra-operator-controller-manager-9b75f4d4d-869m4\" (UID: \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:56 crc kubenswrapper[4689]: I0307 04:33:56.018300 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:56 crc kubenswrapper[4689]: I0307 04:33:56.272856 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4"] Mar 07 04:33:56 crc kubenswrapper[4689]: W0307 04:33:56.293069 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9edd6ad0_247d_45f0_95e9_0291d649c6ec.slice/crio-d1bfc89c10713d5e6c4965888b82c56529c05dd4bb152d98f8d99bef100c9bc4 WatchSource:0}: Error finding container d1bfc89c10713d5e6c4965888b82c56529c05dd4bb152d98f8d99bef100c9bc4: Status 404 returned error can't find the container with id d1bfc89c10713d5e6c4965888b82c56529c05dd4bb152d98f8d99bef100c9bc4 Mar 07 04:33:56 crc kubenswrapper[4689]: I0307 04:33:56.302254 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" event={"ID":"9edd6ad0-247d-45f0-95e9-0291d649c6ec","Type":"ContainerStarted","Data":"d1bfc89c10713d5e6c4965888b82c56529c05dd4bb152d98f8d99bef100c9bc4"} Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.556141 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.557607 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.563235 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.563615 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.564147 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-4lnkm" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.569193 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.572836 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.574046 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.591878 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.594162 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.605777 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.606902 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.620248 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.628976 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.722749 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.722888 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-generated\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.722914 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/af13923a-66fb-409e-a32e-42b1837151fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.722969 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-operator-scripts\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " 
pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723007 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723063 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723103 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-operator-scripts\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723275 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbx2f\" (UniqueName: \"kubernetes.io/projected/af13923a-66fb-409e-a32e-42b1837151fe-kube-api-access-gbx2f\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723308 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-generated\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " 
pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723330 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-default\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723348 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-kolla-config\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723380 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-kolla-config\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723408 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9w2p\" (UniqueName: \"kubernetes.io/projected/26e0bab4-0913-4193-bb07-8d1802eda6c0-kube-api-access-w9w2p\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723428 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-default\") pod \"openstack-galera-2\" (UID: 
\"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723448 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqsp8\" (UniqueName: \"kubernetes.io/projected/243ddc02-c377-44ac-9b47-2240c3d9efed-kube-api-access-nqsp8\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723490 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723511 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-config-data-default\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.723533 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.824997 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbx2f\" (UniqueName: \"kubernetes.io/projected/af13923a-66fb-409e-a32e-42b1837151fe-kube-api-access-gbx2f\") pod \"openstack-galera-0\" (UID: 
\"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825044 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-generated\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825063 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-default\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825078 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-kolla-config\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825102 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-kolla-config\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825119 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9w2p\" (UniqueName: \"kubernetes.io/projected/26e0bab4-0913-4193-bb07-8d1802eda6c0-kube-api-access-w9w2p\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " 
pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825133 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-default\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825150 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqsp8\" (UniqueName: \"kubernetes.io/projected/243ddc02-c377-44ac-9b47-2240c3d9efed-kube-api-access-nqsp8\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825204 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825225 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-config-data-default\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825241 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 
04:33:58.825259 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825273 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-generated\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825287 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/af13923a-66fb-409e-a32e-42b1837151fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825308 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-operator-scripts\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825327 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825349 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825370 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-operator-scripts\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825526 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-generated\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825943 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-kolla-config\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.826046 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.826113 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: 
\"26e0bab4-0913-4193-bb07-8d1802eda6c0\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.826161 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-generated\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.826262 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/af13923a-66fb-409e-a32e-42b1837151fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.826312 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-default\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.826047 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-kolla-config\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.825944 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") device mount path \"/mnt/openstack/pv08\"" 
pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.826806 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.826924 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-config-data-default\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.826944 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-operator-scripts\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.827357 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-operator-scripts\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.828125 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.830780 4689 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-default\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.843929 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9w2p\" (UniqueName: \"kubernetes.io/projected/26e0bab4-0913-4193-bb07-8d1802eda6c0-kube-api-access-w9w2p\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.844515 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.845588 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.848150 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbx2f\" (UniqueName: \"kubernetes.io/projected/af13923a-66fb-409e-a32e-42b1837151fe-kube-api-access-gbx2f\") pod \"openstack-galera-0\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.849023 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqsp8\" (UniqueName: 
\"kubernetes.io/projected/243ddc02-c377-44ac-9b47-2240c3d9efed-kube-api-access-nqsp8\") pod \"openstack-galera-2\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") " pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.854321 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") " pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.893685 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.913294 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:33:58 crc kubenswrapper[4689]: I0307 04:33:58.932325 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:33:59 crc kubenswrapper[4689]: I0307 04:33:59.121472 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Mar 07 04:33:59 crc kubenswrapper[4689]: I0307 04:33:59.322863 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"af13923a-66fb-409e-a32e-42b1837151fe","Type":"ContainerStarted","Data":"1c184d30f5f37ca58e2bf0d51c5ce53307e7afbcc6f63afedaf1fde55a8c6acf"} Mar 07 04:33:59 crc kubenswrapper[4689]: I0307 04:33:59.324989 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" event={"ID":"9edd6ad0-247d-45f0-95e9-0291d649c6ec","Type":"ContainerStarted","Data":"fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d"} Mar 07 04:33:59 crc kubenswrapper[4689]: I0307 04:33:59.325311 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:33:59 crc kubenswrapper[4689]: I0307 04:33:59.358526 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" podStartSLOduration=2.223291273 podStartE2EDuration="4.358513837s" podCreationTimestamp="2026-03-07 04:33:55 +0000 UTC" firstStartedPulling="2026-03-07 04:33:56.297312504 +0000 UTC m=+881.343696003" lastFinishedPulling="2026-03-07 04:33:58.432535058 +0000 UTC m=+883.478918567" observedRunningTime="2026-03-07 04:33:59.357771167 +0000 UTC m=+884.404154656" watchObservedRunningTime="2026-03-07 04:33:59.358513837 +0000 UTC m=+884.404897326" Mar 07 04:33:59 crc kubenswrapper[4689]: W0307 04:33:59.412072 4689 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26e0bab4_0913_4193_bb07_8d1802eda6c0.slice/crio-48ed228fa3b2684e6f8d7ecf1f94f3cf99c2a0bfd42d5d5e5aadc92f1298682c WatchSource:0}: Error finding container 48ed228fa3b2684e6f8d7ecf1f94f3cf99c2a0bfd42d5d5e5aadc92f1298682c: Status 404 returned error can't find the container with id 48ed228fa3b2684e6f8d7ecf1f94f3cf99c2a0bfd42d5d5e5aadc92f1298682c Mar 07 04:33:59 crc kubenswrapper[4689]: I0307 04:33:59.413073 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Mar 07 04:33:59 crc kubenswrapper[4689]: I0307 04:33:59.423596 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Mar 07 04:33:59 crc kubenswrapper[4689]: W0307 04:33:59.434127 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod243ddc02_c377_44ac_9b47_2240c3d9efed.slice/crio-7947c3d03fcf0132f31811227f3583400e6eea68bf29f2a0ee53df51b711243f WatchSource:0}: Error finding container 7947c3d03fcf0132f31811227f3583400e6eea68bf29f2a0ee53df51b711243f: Status 404 returned error can't find the container with id 7947c3d03fcf0132f31811227f3583400e6eea68bf29f2a0ee53df51b711243f Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.136034 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547634-8rcjv"] Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.136982 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547634-8rcjv"] Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.137048 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547634-8rcjv" Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.139076 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.139137 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.139362 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.248868 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxxxz\" (UniqueName: \"kubernetes.io/projected/f6d21cd9-9297-44bf-8680-246a190f3110-kube-api-access-fxxxz\") pod \"auto-csr-approver-29547634-8rcjv\" (UID: \"f6d21cd9-9297-44bf-8680-246a190f3110\") " pod="openshift-infra/auto-csr-approver-29547634-8rcjv" Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.335515 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"243ddc02-c377-44ac-9b47-2240c3d9efed","Type":"ContainerStarted","Data":"7947c3d03fcf0132f31811227f3583400e6eea68bf29f2a0ee53df51b711243f"} Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.336756 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"26e0bab4-0913-4193-bb07-8d1802eda6c0","Type":"ContainerStarted","Data":"48ed228fa3b2684e6f8d7ecf1f94f3cf99c2a0bfd42d5d5e5aadc92f1298682c"} Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.350808 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxxxz\" (UniqueName: \"kubernetes.io/projected/f6d21cd9-9297-44bf-8680-246a190f3110-kube-api-access-fxxxz\") pod \"auto-csr-approver-29547634-8rcjv\" (UID: 
\"f6d21cd9-9297-44bf-8680-246a190f3110\") " pod="openshift-infra/auto-csr-approver-29547634-8rcjv" Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.369732 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxxxz\" (UniqueName: \"kubernetes.io/projected/f6d21cd9-9297-44bf-8680-246a190f3110-kube-api-access-fxxxz\") pod \"auto-csr-approver-29547634-8rcjv\" (UID: \"f6d21cd9-9297-44bf-8680-246a190f3110\") " pod="openshift-infra/auto-csr-approver-29547634-8rcjv" Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.464410 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547634-8rcjv" Mar 07 04:34:00 crc kubenswrapper[4689]: I0307 04:34:00.963560 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547634-8rcjv"] Mar 07 04:34:00 crc kubenswrapper[4689]: W0307 04:34:00.977002 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6d21cd9_9297_44bf_8680_246a190f3110.slice/crio-1075defdebe4c05f666edaa820eac89d3bfda65b6e6d61b3253e171cc9e3925b WatchSource:0}: Error finding container 1075defdebe4c05f666edaa820eac89d3bfda65b6e6d61b3253e171cc9e3925b: Status 404 returned error can't find the container with id 1075defdebe4c05f666edaa820eac89d3bfda65b6e6d61b3253e171cc9e3925b Mar 07 04:34:01 crc kubenswrapper[4689]: I0307 04:34:01.345098 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547634-8rcjv" event={"ID":"f6d21cd9-9297-44bf-8680-246a190f3110","Type":"ContainerStarted","Data":"1075defdebe4c05f666edaa820eac89d3bfda65b6e6d61b3253e171cc9e3925b"} Mar 07 04:34:06 crc kubenswrapper[4689]: I0307 04:34:06.024729 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:34:10 crc kubenswrapper[4689]: 
I0307 04:34:10.441993 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"26e0bab4-0913-4193-bb07-8d1802eda6c0","Type":"ContainerStarted","Data":"815152161e0a7dded51afa61b167ea2c378366e12438620ee58b3bfcceae4ed6"} Mar 07 04:34:10 crc kubenswrapper[4689]: I0307 04:34:10.444002 4689 generic.go:334] "Generic (PLEG): container finished" podID="f6d21cd9-9297-44bf-8680-246a190f3110" containerID="d8d5ff449879a4e7c2e212532eda9528345078d1c13fa4d42b32b9082385d81e" exitCode=0 Mar 07 04:34:10 crc kubenswrapper[4689]: I0307 04:34:10.444057 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547634-8rcjv" event={"ID":"f6d21cd9-9297-44bf-8680-246a190f3110","Type":"ContainerDied","Data":"d8d5ff449879a4e7c2e212532eda9528345078d1c13fa4d42b32b9082385d81e"} Mar 07 04:34:10 crc kubenswrapper[4689]: I0307 04:34:10.445540 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"af13923a-66fb-409e-a32e-42b1837151fe","Type":"ContainerStarted","Data":"9964af22e9987fe53552277762e5029819ae340630713a219d90389162debb35"} Mar 07 04:34:10 crc kubenswrapper[4689]: I0307 04:34:10.447039 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"243ddc02-c377-44ac-9b47-2240c3d9efed","Type":"ContainerStarted","Data":"dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc"} Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.522500 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"] Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.523456 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.537962 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-snrll" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.538045 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.543116 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.739867 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-config-data\") pod \"memcached-0\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.740253 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spvnz\" (UniqueName: \"kubernetes.io/projected/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kube-api-access-spvnz\") pod \"memcached-0\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.740278 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kolla-config\") pod \"memcached-0\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.830017 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547634-8rcjv" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.842092 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-config-data\") pod \"memcached-0\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.842213 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spvnz\" (UniqueName: \"kubernetes.io/projected/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kube-api-access-spvnz\") pod \"memcached-0\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.842243 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kolla-config\") pod \"memcached-0\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.843079 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kolla-config\") pod \"memcached-0\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.847778 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-config-data\") pod \"memcached-0\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.863929 4689 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-spvnz\" (UniqueName: \"kubernetes.io/projected/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kube-api-access-spvnz\") pod \"memcached-0\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.893573 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.946909 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxxxz\" (UniqueName: \"kubernetes.io/projected/f6d21cd9-9297-44bf-8680-246a190f3110-kube-api-access-fxxxz\") pod \"f6d21cd9-9297-44bf-8680-246a190f3110\" (UID: \"f6d21cd9-9297-44bf-8680-246a190f3110\") " Mar 07 04:34:11 crc kubenswrapper[4689]: I0307 04:34:11.952378 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d21cd9-9297-44bf-8680-246a190f3110-kube-api-access-fxxxz" (OuterVolumeSpecName: "kube-api-access-fxxxz") pod "f6d21cd9-9297-44bf-8680-246a190f3110" (UID: "f6d21cd9-9297-44bf-8680-246a190f3110"). InnerVolumeSpecName "kube-api-access-fxxxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.047843 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxxxz\" (UniqueName: \"kubernetes.io/projected/f6d21cd9-9297-44bf-8680-246a190f3110-kube-api-access-fxxxz\") on node \"crc\" DevicePath \"\"" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.114092 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-rmhzb"] Mar 07 04:34:12 crc kubenswrapper[4689]: E0307 04:34:12.114362 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d21cd9-9297-44bf-8680-246a190f3110" containerName="oc" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.114377 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d21cd9-9297-44bf-8680-246a190f3110" containerName="oc" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.114494 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d21cd9-9297-44bf-8680-246a190f3110" containerName="oc" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.114865 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.145123 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-7vlj6" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.149296 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmmf\" (UniqueName: \"kubernetes.io/projected/7d707a8a-1e9b-4520-88d9-ba6c22dc71f0-kube-api-access-fsmmf\") pod \"rabbitmq-cluster-operator-index-rmhzb\" (UID: \"7d707a8a-1e9b-4520-88d9-ba6c22dc71f0\") " pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.154882 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-rmhzb"] Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.249957 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.250104 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsmmf\" (UniqueName: \"kubernetes.io/projected/7d707a8a-1e9b-4520-88d9-ba6c22dc71f0-kube-api-access-fsmmf\") pod \"rabbitmq-cluster-operator-index-rmhzb\" (UID: \"7d707a8a-1e9b-4520-88d9-ba6c22dc71f0\") " pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.305993 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmmf\" (UniqueName: \"kubernetes.io/projected/7d707a8a-1e9b-4520-88d9-ba6c22dc71f0-kube-api-access-fsmmf\") pod \"rabbitmq-cluster-operator-index-rmhzb\" (UID: \"7d707a8a-1e9b-4520-88d9-ba6c22dc71f0\") " pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.445247 
4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.456431 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547634-8rcjv" event={"ID":"f6d21cd9-9297-44bf-8680-246a190f3110","Type":"ContainerDied","Data":"1075defdebe4c05f666edaa820eac89d3bfda65b6e6d61b3253e171cc9e3925b"} Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.456467 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1075defdebe4c05f666edaa820eac89d3bfda65b6e6d61b3253e171cc9e3925b" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.456508 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547634-8rcjv" Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.457791 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"c41b2833-be4f-46a8-b1fb-7c244ac8530b","Type":"ContainerStarted","Data":"a4b54915837ec35d4b36960643c7576d6ea2b7df400722e9058394c5c654e041"} Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.769375 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-rmhzb"] Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.890892 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547628-zf96k"] Mar 07 04:34:12 crc kubenswrapper[4689]: I0307 04:34:12.894088 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547628-zf96k"] Mar 07 04:34:13 crc kubenswrapper[4689]: I0307 04:34:13.466665 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" 
event={"ID":"7d707a8a-1e9b-4520-88d9-ba6c22dc71f0","Type":"ContainerStarted","Data":"c51bcab9b94bbea650f8b639a9277c90465ec0ff7da71b8e5a5bdf0b0cedbacc"} Mar 07 04:34:14 crc kubenswrapper[4689]: I0307 04:34:14.518800 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c62f147-723a-420a-b75b-efe6a8585eb9" path="/var/lib/kubelet/pods/3c62f147-723a-420a-b75b-efe6a8585eb9/volumes" Mar 07 04:34:15 crc kubenswrapper[4689]: I0307 04:34:15.525572 4689 generic.go:334] "Generic (PLEG): container finished" podID="af13923a-66fb-409e-a32e-42b1837151fe" containerID="9964af22e9987fe53552277762e5029819ae340630713a219d90389162debb35" exitCode=0 Mar 07 04:34:15 crc kubenswrapper[4689]: I0307 04:34:15.525653 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"af13923a-66fb-409e-a32e-42b1837151fe","Type":"ContainerDied","Data":"9964af22e9987fe53552277762e5029819ae340630713a219d90389162debb35"} Mar 07 04:34:15 crc kubenswrapper[4689]: I0307 04:34:15.531101 4689 generic.go:334] "Generic (PLEG): container finished" podID="243ddc02-c377-44ac-9b47-2240c3d9efed" containerID="dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc" exitCode=0 Mar 07 04:34:15 crc kubenswrapper[4689]: I0307 04:34:15.531201 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"243ddc02-c377-44ac-9b47-2240c3d9efed","Type":"ContainerDied","Data":"dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc"} Mar 07 04:34:15 crc kubenswrapper[4689]: I0307 04:34:15.532907 4689 generic.go:334] "Generic (PLEG): container finished" podID="26e0bab4-0913-4193-bb07-8d1802eda6c0" containerID="815152161e0a7dded51afa61b167ea2c378366e12438620ee58b3bfcceae4ed6" exitCode=0 Mar 07 04:34:15 crc kubenswrapper[4689]: I0307 04:34:15.532938 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" 
event={"ID":"26e0bab4-0913-4193-bb07-8d1802eda6c0","Type":"ContainerDied","Data":"815152161e0a7dded51afa61b167ea2c378366e12438620ee58b3bfcceae4ed6"} Mar 07 04:34:16 crc kubenswrapper[4689]: I0307 04:34:16.303488 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-rmhzb"] Mar 07 04:34:16 crc kubenswrapper[4689]: I0307 04:34:16.542556 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"243ddc02-c377-44ac-9b47-2240c3d9efed","Type":"ContainerStarted","Data":"78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de"} Mar 07 04:34:16 crc kubenswrapper[4689]: I0307 04:34:16.545682 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"26e0bab4-0913-4193-bb07-8d1802eda6c0","Type":"ContainerStarted","Data":"a15b111de1ac2c83ca80e44c5c4ff7f0530be43f7923b13afab7da027b126d2f"} Mar 07 04:34:16 crc kubenswrapper[4689]: I0307 04:34:16.564013 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" podStartSLOduration=9.583329782 podStartE2EDuration="19.563997748s" podCreationTimestamp="2026-03-07 04:33:57 +0000 UTC" firstStartedPulling="2026-03-07 04:33:59.438568921 +0000 UTC m=+884.484952410" lastFinishedPulling="2026-03-07 04:34:09.419236877 +0000 UTC m=+894.465620376" observedRunningTime="2026-03-07 04:34:16.559907129 +0000 UTC m=+901.606290618" watchObservedRunningTime="2026-03-07 04:34:16.563997748 +0000 UTC m=+901.610381237" Mar 07 04:34:16 crc kubenswrapper[4689]: I0307 04:34:16.581448 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=9.499078577 podStartE2EDuration="19.58143124s" podCreationTimestamp="2026-03-07 04:33:57 +0000 UTC" firstStartedPulling="2026-03-07 04:33:59.414219845 +0000 UTC m=+884.460603374" lastFinishedPulling="2026-03-07 
04:34:09.496572538 +0000 UTC m=+894.542956037" observedRunningTime="2026-03-07 04:34:16.580419994 +0000 UTC m=+901.626803483" watchObservedRunningTime="2026-03-07 04:34:16.58143124 +0000 UTC m=+901.627814719" Mar 07 04:34:16 crc kubenswrapper[4689]: I0307 04:34:16.905901 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-mtqqp"] Mar 07 04:34:16 crc kubenswrapper[4689]: I0307 04:34:16.906744 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" Mar 07 04:34:16 crc kubenswrapper[4689]: I0307 04:34:16.909733 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-mtqqp"] Mar 07 04:34:17 crc kubenswrapper[4689]: I0307 04:34:17.009108 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgp79\" (UniqueName: \"kubernetes.io/projected/05c84021-6716-4d39-ab89-1cea45f77a64-kube-api-access-fgp79\") pod \"rabbitmq-cluster-operator-index-mtqqp\" (UID: \"05c84021-6716-4d39-ab89-1cea45f77a64\") " pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" Mar 07 04:34:17 crc kubenswrapper[4689]: I0307 04:34:17.110994 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgp79\" (UniqueName: \"kubernetes.io/projected/05c84021-6716-4d39-ab89-1cea45f77a64-kube-api-access-fgp79\") pod \"rabbitmq-cluster-operator-index-mtqqp\" (UID: \"05c84021-6716-4d39-ab89-1cea45f77a64\") " pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" Mar 07 04:34:17 crc kubenswrapper[4689]: I0307 04:34:17.147573 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgp79\" (UniqueName: \"kubernetes.io/projected/05c84021-6716-4d39-ab89-1cea45f77a64-kube-api-access-fgp79\") pod \"rabbitmq-cluster-operator-index-mtqqp\" (UID: 
\"05c84021-6716-4d39-ab89-1cea45f77a64\") " pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" Mar 07 04:34:17 crc kubenswrapper[4689]: I0307 04:34:17.241241 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" Mar 07 04:34:17 crc kubenswrapper[4689]: I0307 04:34:17.555177 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"af13923a-66fb-409e-a32e-42b1837151fe","Type":"ContainerStarted","Data":"aa6e9e5b81519a2444b9b02197944b8195cde24c07f3c66e98b50d857210adf7"} Mar 07 04:34:17 crc kubenswrapper[4689]: I0307 04:34:17.582758 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=10.264134677 podStartE2EDuration="20.582736678s" podCreationTimestamp="2026-03-07 04:33:57 +0000 UTC" firstStartedPulling="2026-03-07 04:33:59.134550985 +0000 UTC m=+884.180934474" lastFinishedPulling="2026-03-07 04:34:09.453152976 +0000 UTC m=+894.499536475" observedRunningTime="2026-03-07 04:34:17.570672688 +0000 UTC m=+902.617056167" watchObservedRunningTime="2026-03-07 04:34:17.582736678 +0000 UTC m=+902.629120177" Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.292442 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-mtqqp"] Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.564193 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" event={"ID":"05c84021-6716-4d39-ab89-1cea45f77a64","Type":"ContainerStarted","Data":"3accee3638adb30f1fa6296babcd1d47f10edc16066a2e2584670f0a7faa0332"} Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.566706 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" 
event={"ID":"c41b2833-be4f-46a8-b1fb-7c244ac8530b","Type":"ContainerStarted","Data":"53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96"} Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.566823 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.569876 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" event={"ID":"7d707a8a-1e9b-4520-88d9-ba6c22dc71f0","Type":"ContainerStarted","Data":"a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106"} Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.570079 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" podUID="7d707a8a-1e9b-4520-88d9-ba6c22dc71f0" containerName="registry-server" containerID="cri-o://a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106" gracePeriod=2 Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.600375 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=3.609253033 podStartE2EDuration="7.600362358s" podCreationTimestamp="2026-03-07 04:34:11 +0000 UTC" firstStartedPulling="2026-03-07 04:34:12.256623941 +0000 UTC m=+897.303007420" lastFinishedPulling="2026-03-07 04:34:16.247733246 +0000 UTC m=+901.294116745" observedRunningTime="2026-03-07 04:34:18.595660413 +0000 UTC m=+903.642043912" watchObservedRunningTime="2026-03-07 04:34:18.600362358 +0000 UTC m=+903.646745857" Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.628450 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" podStartSLOduration=1.531074076 podStartE2EDuration="6.628427192s" podCreationTimestamp="2026-03-07 04:34:12 +0000 UTC" 
firstStartedPulling="2026-03-07 04:34:12.794700508 +0000 UTC m=+897.841083997" lastFinishedPulling="2026-03-07 04:34:17.892053604 +0000 UTC m=+902.938437113" observedRunningTime="2026-03-07 04:34:18.623353658 +0000 UTC m=+903.669737157" watchObservedRunningTime="2026-03-07 04:34:18.628427192 +0000 UTC m=+903.674810691" Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.894460 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.895721 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.914945 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.914987 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.933048 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:34:18 crc kubenswrapper[4689]: I0307 04:34:18.933092 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.083376 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.141035 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsmmf\" (UniqueName: \"kubernetes.io/projected/7d707a8a-1e9b-4520-88d9-ba6c22dc71f0-kube-api-access-fsmmf\") pod \"7d707a8a-1e9b-4520-88d9-ba6c22dc71f0\" (UID: \"7d707a8a-1e9b-4520-88d9-ba6c22dc71f0\") " Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.146243 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d707a8a-1e9b-4520-88d9-ba6c22dc71f0-kube-api-access-fsmmf" (OuterVolumeSpecName: "kube-api-access-fsmmf") pod "7d707a8a-1e9b-4520-88d9-ba6c22dc71f0" (UID: "7d707a8a-1e9b-4520-88d9-ba6c22dc71f0"). InnerVolumeSpecName "kube-api-access-fsmmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.242830 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsmmf\" (UniqueName: \"kubernetes.io/projected/7d707a8a-1e9b-4520-88d9-ba6c22dc71f0-kube-api-access-fsmmf\") on node \"crc\" DevicePath \"\"" Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.581676 4689 generic.go:334] "Generic (PLEG): container finished" podID="7d707a8a-1e9b-4520-88d9-ba6c22dc71f0" containerID="a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106" exitCode=0 Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.581785 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" event={"ID":"7d707a8a-1e9b-4520-88d9-ba6c22dc71f0","Type":"ContainerDied","Data":"a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106"} Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.581846 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" 
event={"ID":"7d707a8a-1e9b-4520-88d9-ba6c22dc71f0","Type":"ContainerDied","Data":"c51bcab9b94bbea650f8b639a9277c90465ec0ff7da71b8e5a5bdf0b0cedbacc"} Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.581877 4689 scope.go:117] "RemoveContainer" containerID="a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106" Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.582130 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-rmhzb" Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.584801 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" event={"ID":"05c84021-6716-4d39-ab89-1cea45f77a64","Type":"ContainerStarted","Data":"8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247"} Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.611782 4689 scope.go:117] "RemoveContainer" containerID="a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106" Mar 07 04:34:19 crc kubenswrapper[4689]: E0307 04:34:19.612899 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106\": container with ID starting with a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106 not found: ID does not exist" containerID="a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106" Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.612964 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106"} err="failed to get container status \"a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106\": rpc error: code = NotFound desc = could not find container \"a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106\": container with ID 
starting with a6c99bb123452f8a19cdd2e1320fbdc6415e1ee1c0d38661b0440407684be106 not found: ID does not exist" Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.614674 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" podStartSLOduration=3.167498756 podStartE2EDuration="3.614652s" podCreationTimestamp="2026-03-07 04:34:16 +0000 UTC" firstStartedPulling="2026-03-07 04:34:18.303595384 +0000 UTC m=+903.349978873" lastFinishedPulling="2026-03-07 04:34:18.750748618 +0000 UTC m=+903.797132117" observedRunningTime="2026-03-07 04:34:19.607895841 +0000 UTC m=+904.654279340" watchObservedRunningTime="2026-03-07 04:34:19.614652 +0000 UTC m=+904.661035529" Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.640299 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-rmhzb"] Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.645029 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-rmhzb"] Mar 07 04:34:19 crc kubenswrapper[4689]: I0307 04:34:19.839895 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d707a8a-1e9b-4520-88d9-ba6c22dc71f0" path="/var/lib/kubelet/pods/7d707a8a-1e9b-4520-88d9-ba6c22dc71f0/volumes" Mar 07 04:34:25 crc kubenswrapper[4689]: I0307 04:34:25.035939 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:34:25 crc kubenswrapper[4689]: I0307 04:34:25.141752 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:34:25 crc kubenswrapper[4689]: E0307 04:34:25.340526 4689 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.151:35406->38.102.83.151:43475: write tcp 38.102.83.151:35406->38.102.83.151:43475: write: broken pipe Mar 07 04:34:26 
crc kubenswrapper[4689]: I0307 04:34:26.895323 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.242112 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.242295 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.282440 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.632807 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/root-account-create-update-ndcqp"] Mar 07 04:34:27 crc kubenswrapper[4689]: E0307 04:34:27.633070 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d707a8a-1e9b-4520-88d9-ba6c22dc71f0" containerName="registry-server" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.633084 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d707a8a-1e9b-4520-88d9-ba6c22dc71f0" containerName="registry-server" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.633230 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d707a8a-1e9b-4520-88d9-ba6c22dc71f0" containerName="registry-server" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.633673 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-ndcqp" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.635701 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-mariadb-root-db-secret" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.653719 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-ndcqp"] Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.740545 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.756556 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6542a99c-674f-41dc-ae19-f780414bbbf3-operator-scripts\") pod \"root-account-create-update-ndcqp\" (UID: \"6542a99c-674f-41dc-ae19-f780414bbbf3\") " pod="glance-kuttl-tests/root-account-create-update-ndcqp" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.756601 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n575\" (UniqueName: \"kubernetes.io/projected/6542a99c-674f-41dc-ae19-f780414bbbf3-kube-api-access-9n575\") pod \"root-account-create-update-ndcqp\" (UID: \"6542a99c-674f-41dc-ae19-f780414bbbf3\") " pod="glance-kuttl-tests/root-account-create-update-ndcqp" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.857923 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n575\" (UniqueName: \"kubernetes.io/projected/6542a99c-674f-41dc-ae19-f780414bbbf3-kube-api-access-9n575\") pod \"root-account-create-update-ndcqp\" (UID: \"6542a99c-674f-41dc-ae19-f780414bbbf3\") " pod="glance-kuttl-tests/root-account-create-update-ndcqp" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.858041 
4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6542a99c-674f-41dc-ae19-f780414bbbf3-operator-scripts\") pod \"root-account-create-update-ndcqp\" (UID: \"6542a99c-674f-41dc-ae19-f780414bbbf3\") " pod="glance-kuttl-tests/root-account-create-update-ndcqp" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.858756 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6542a99c-674f-41dc-ae19-f780414bbbf3-operator-scripts\") pod \"root-account-create-update-ndcqp\" (UID: \"6542a99c-674f-41dc-ae19-f780414bbbf3\") " pod="glance-kuttl-tests/root-account-create-update-ndcqp" Mar 07 04:34:27 crc kubenswrapper[4689]: I0307 04:34:27.878361 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n575\" (UniqueName: \"kubernetes.io/projected/6542a99c-674f-41dc-ae19-f780414bbbf3-kube-api-access-9n575\") pod \"root-account-create-update-ndcqp\" (UID: \"6542a99c-674f-41dc-ae19-f780414bbbf3\") " pod="glance-kuttl-tests/root-account-create-update-ndcqp" Mar 07 04:34:28 crc kubenswrapper[4689]: I0307 04:34:28.021819 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-ndcqp" Mar 07 04:34:28 crc kubenswrapper[4689]: I0307 04:34:28.433523 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-ndcqp"] Mar 07 04:34:28 crc kubenswrapper[4689]: I0307 04:34:28.652443 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-ndcqp" event={"ID":"6542a99c-674f-41dc-ae19-f780414bbbf3","Type":"ContainerStarted","Data":"1da4067e5102555aac456769b1b0ba6f41f94a68c76beb712b50860008fa6a25"} Mar 07 04:34:28 crc kubenswrapper[4689]: I0307 04:34:28.652483 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-ndcqp" event={"ID":"6542a99c-674f-41dc-ae19-f780414bbbf3","Type":"ContainerStarted","Data":"05a9c7db1175e4054848cd52873c80d27f756fede5eee5baefa2c466452168e4"} Mar 07 04:34:28 crc kubenswrapper[4689]: I0307 04:34:28.676138 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/root-account-create-update-ndcqp" podStartSLOduration=1.6761200569999999 podStartE2EDuration="1.676120057s" podCreationTimestamp="2026-03-07 04:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:34:28.671841412 +0000 UTC m=+913.718224911" watchObservedRunningTime="2026-03-07 04:34:28.676120057 +0000 UTC m=+913.722503566" Mar 07 04:34:29 crc kubenswrapper[4689]: I0307 04:34:29.023945 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/openstack-galera-2" podUID="243ddc02-c377-44ac-9b47-2240c3d9efed" containerName="galera" probeResult="failure" output=< Mar 07 04:34:29 crc kubenswrapper[4689]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Mar 07 04:34:29 crc kubenswrapper[4689]: > Mar 07 04:34:29 crc kubenswrapper[4689]: I0307 04:34:29.951429 4689 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9"] Mar 07 04:34:29 crc kubenswrapper[4689]: I0307 04:34:29.953114 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:29 crc kubenswrapper[4689]: I0307 04:34:29.956741 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4j8gt" Mar 07 04:34:29 crc kubenswrapper[4689]: I0307 04:34:29.970534 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9"] Mar 07 04:34:30 crc kubenswrapper[4689]: I0307 04:34:30.003676 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b69bt\" (UniqueName: \"kubernetes.io/projected/1de71b3a-96bf-445c-b321-5f0d10b77523-kube-api-access-b69bt\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:30 crc kubenswrapper[4689]: I0307 04:34:30.003816 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:30 crc kubenswrapper[4689]: I0307 04:34:30.003852 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-util\") pod 
\"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:30 crc kubenswrapper[4689]: I0307 04:34:30.105565 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b69bt\" (UniqueName: \"kubernetes.io/projected/1de71b3a-96bf-445c-b321-5f0d10b77523-kube-api-access-b69bt\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:30 crc kubenswrapper[4689]: I0307 04:34:30.105700 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:30 crc kubenswrapper[4689]: I0307 04:34:30.105739 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:30 crc kubenswrapper[4689]: I0307 04:34:30.106263 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:30 crc kubenswrapper[4689]: I0307 04:34:30.106422 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:30 crc kubenswrapper[4689]: I0307 04:34:30.135525 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b69bt\" (UniqueName: \"kubernetes.io/projected/1de71b3a-96bf-445c-b321-5f0d10b77523-kube-api-access-b69bt\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:30 crc kubenswrapper[4689]: I0307 04:34:30.283460 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:32 crc kubenswrapper[4689]: W0307 04:34:32.627370 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1de71b3a_96bf_445c_b321_5f0d10b77523.slice/crio-a85b67f7dc89c52e269a472e11196724f63128247274a30d5b3947232cd165dd WatchSource:0}: Error finding container a85b67f7dc89c52e269a472e11196724f63128247274a30d5b3947232cd165dd: Status 404 returned error can't find the container with id a85b67f7dc89c52e269a472e11196724f63128247274a30d5b3947232cd165dd Mar 07 04:34:32 crc kubenswrapper[4689]: I0307 04:34:32.627820 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9"] Mar 07 04:34:32 crc kubenswrapper[4689]: I0307 04:34:32.685764 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" event={"ID":"1de71b3a-96bf-445c-b321-5f0d10b77523","Type":"ContainerStarted","Data":"a85b67f7dc89c52e269a472e11196724f63128247274a30d5b3947232cd165dd"} Mar 07 04:34:33 crc kubenswrapper[4689]: I0307 04:34:33.697090 4689 generic.go:334] "Generic (PLEG): container finished" podID="1de71b3a-96bf-445c-b321-5f0d10b77523" containerID="05b2be4ed567110edd549eb6ed857425bac74c0325e23a8df744b6cc8cb33ca8" exitCode=0 Mar 07 04:34:33 crc kubenswrapper[4689]: I0307 04:34:33.697150 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" event={"ID":"1de71b3a-96bf-445c-b321-5f0d10b77523","Type":"ContainerDied","Data":"05b2be4ed567110edd549eb6ed857425bac74c0325e23a8df744b6cc8cb33ca8"} Mar 07 04:34:34 crc kubenswrapper[4689]: I0307 04:34:34.709425 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="1de71b3a-96bf-445c-b321-5f0d10b77523" containerID="59b9470ac28854b2cd352e30c2faeaae9c7143dd5cd89f202217c105ff4dc226" exitCode=0 Mar 07 04:34:34 crc kubenswrapper[4689]: I0307 04:34:34.709504 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" event={"ID":"1de71b3a-96bf-445c-b321-5f0d10b77523","Type":"ContainerDied","Data":"59b9470ac28854b2cd352e30c2faeaae9c7143dd5cd89f202217c105ff4dc226"} Mar 07 04:34:34 crc kubenswrapper[4689]: I0307 04:34:34.714940 4689 generic.go:334] "Generic (PLEG): container finished" podID="6542a99c-674f-41dc-ae19-f780414bbbf3" containerID="1da4067e5102555aac456769b1b0ba6f41f94a68c76beb712b50860008fa6a25" exitCode=0 Mar 07 04:34:34 crc kubenswrapper[4689]: I0307 04:34:34.714981 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-ndcqp" event={"ID":"6542a99c-674f-41dc-ae19-f780414bbbf3","Type":"ContainerDied","Data":"1da4067e5102555aac456769b1b0ba6f41f94a68c76beb712b50860008fa6a25"} Mar 07 04:34:35 crc kubenswrapper[4689]: I0307 04:34:35.731233 4689 generic.go:334] "Generic (PLEG): container finished" podID="1de71b3a-96bf-445c-b321-5f0d10b77523" containerID="b02927d1de884d5663972f872f84754a8034f7702a4b935529b0f8e12587ce6e" exitCode=0 Mar 07 04:34:35 crc kubenswrapper[4689]: I0307 04:34:35.733031 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" event={"ID":"1de71b3a-96bf-445c-b321-5f0d10b77523","Type":"ContainerDied","Data":"b02927d1de884d5663972f872f84754a8034f7702a4b935529b0f8e12587ce6e"} Mar 07 04:34:36 crc kubenswrapper[4689]: I0307 04:34:36.131388 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-ndcqp" Mar 07 04:34:36 crc kubenswrapper[4689]: I0307 04:34:36.294423 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6542a99c-674f-41dc-ae19-f780414bbbf3-operator-scripts\") pod \"6542a99c-674f-41dc-ae19-f780414bbbf3\" (UID: \"6542a99c-674f-41dc-ae19-f780414bbbf3\") " Mar 07 04:34:36 crc kubenswrapper[4689]: I0307 04:34:36.295115 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n575\" (UniqueName: \"kubernetes.io/projected/6542a99c-674f-41dc-ae19-f780414bbbf3-kube-api-access-9n575\") pod \"6542a99c-674f-41dc-ae19-f780414bbbf3\" (UID: \"6542a99c-674f-41dc-ae19-f780414bbbf3\") " Mar 07 04:34:36 crc kubenswrapper[4689]: I0307 04:34:36.295613 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6542a99c-674f-41dc-ae19-f780414bbbf3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6542a99c-674f-41dc-ae19-f780414bbbf3" (UID: "6542a99c-674f-41dc-ae19-f780414bbbf3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:34:36 crc kubenswrapper[4689]: I0307 04:34:36.295893 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6542a99c-674f-41dc-ae19-f780414bbbf3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:34:36 crc kubenswrapper[4689]: I0307 04:34:36.309117 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6542a99c-674f-41dc-ae19-f780414bbbf3-kube-api-access-9n575" (OuterVolumeSpecName: "kube-api-access-9n575") pod "6542a99c-674f-41dc-ae19-f780414bbbf3" (UID: "6542a99c-674f-41dc-ae19-f780414bbbf3"). InnerVolumeSpecName "kube-api-access-9n575". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:34:36 crc kubenswrapper[4689]: I0307 04:34:36.396817 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n575\" (UniqueName: \"kubernetes.io/projected/6542a99c-674f-41dc-ae19-f780414bbbf3-kube-api-access-9n575\") on node \"crc\" DevicePath \"\"" Mar 07 04:34:36 crc kubenswrapper[4689]: I0307 04:34:36.741279 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-ndcqp" event={"ID":"6542a99c-674f-41dc-ae19-f780414bbbf3","Type":"ContainerDied","Data":"05a9c7db1175e4054848cd52873c80d27f756fede5eee5baefa2c466452168e4"} Mar 07 04:34:36 crc kubenswrapper[4689]: I0307 04:34:36.741318 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-ndcqp" Mar 07 04:34:36 crc kubenswrapper[4689]: I0307 04:34:36.741334 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05a9c7db1175e4054848cd52873c80d27f756fede5eee5baefa2c466452168e4" Mar 07 04:34:36 crc kubenswrapper[4689]: I0307 04:34:36.994983 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.106015 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-bundle\") pod \"1de71b3a-96bf-445c-b321-5f0d10b77523\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.106091 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-util\") pod \"1de71b3a-96bf-445c-b321-5f0d10b77523\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.106225 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b69bt\" (UniqueName: \"kubernetes.io/projected/1de71b3a-96bf-445c-b321-5f0d10b77523-kube-api-access-b69bt\") pod \"1de71b3a-96bf-445c-b321-5f0d10b77523\" (UID: \"1de71b3a-96bf-445c-b321-5f0d10b77523\") " Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.106716 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-bundle" (OuterVolumeSpecName: "bundle") pod "1de71b3a-96bf-445c-b321-5f0d10b77523" (UID: "1de71b3a-96bf-445c-b321-5f0d10b77523"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.112826 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de71b3a-96bf-445c-b321-5f0d10b77523-kube-api-access-b69bt" (OuterVolumeSpecName: "kube-api-access-b69bt") pod "1de71b3a-96bf-445c-b321-5f0d10b77523" (UID: "1de71b3a-96bf-445c-b321-5f0d10b77523"). InnerVolumeSpecName "kube-api-access-b69bt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.130825 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-util" (OuterVolumeSpecName: "util") pod "1de71b3a-96bf-445c-b321-5f0d10b77523" (UID: "1de71b3a-96bf-445c-b321-5f0d10b77523"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.207294 4689 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.207335 4689 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1de71b3a-96bf-445c-b321-5f0d10b77523-util\") on node \"crc\" DevicePath \"\"" Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.207348 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b69bt\" (UniqueName: \"kubernetes.io/projected/1de71b3a-96bf-445c-b321-5f0d10b77523-kube-api-access-b69bt\") on node \"crc\" DevicePath \"\"" Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.754671 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" event={"ID":"1de71b3a-96bf-445c-b321-5f0d10b77523","Type":"ContainerDied","Data":"a85b67f7dc89c52e269a472e11196724f63128247274a30d5b3947232cd165dd"} Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.754734 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a85b67f7dc89c52e269a472e11196724f63128247274a30d5b3947232cd165dd" Mar 07 04:34:37 crc kubenswrapper[4689]: I0307 04:34:37.754733 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9" Mar 07 04:34:38 crc kubenswrapper[4689]: I0307 04:34:38.602691 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:34:38 crc kubenswrapper[4689]: I0307 04:34:38.714312 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:34:39 crc kubenswrapper[4689]: I0307 04:34:39.622327 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:34:39 crc kubenswrapper[4689]: I0307 04:34:39.715405 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.374959 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx"] Mar 07 04:34:46 crc kubenswrapper[4689]: E0307 04:34:46.375247 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6542a99c-674f-41dc-ae19-f780414bbbf3" containerName="mariadb-account-create-update" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.375261 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="6542a99c-674f-41dc-ae19-f780414bbbf3" containerName="mariadb-account-create-update" Mar 07 04:34:46 crc kubenswrapper[4689]: E0307 04:34:46.375273 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de71b3a-96bf-445c-b321-5f0d10b77523" containerName="util" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.375279 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de71b3a-96bf-445c-b321-5f0d10b77523" containerName="util" Mar 07 04:34:46 crc kubenswrapper[4689]: E0307 04:34:46.375291 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de71b3a-96bf-445c-b321-5f0d10b77523" 
containerName="pull" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.375297 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de71b3a-96bf-445c-b321-5f0d10b77523" containerName="pull" Mar 07 04:34:46 crc kubenswrapper[4689]: E0307 04:34:46.375312 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de71b3a-96bf-445c-b321-5f0d10b77523" containerName="extract" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.375319 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de71b3a-96bf-445c-b321-5f0d10b77523" containerName="extract" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.375416 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="6542a99c-674f-41dc-ae19-f780414bbbf3" containerName="mariadb-account-create-update" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.375430 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="1de71b3a-96bf-445c-b321-5f0d10b77523" containerName="extract" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.375873 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.378378 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-9lnnm" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.393197 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx"] Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.549960 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tnqb\" (UniqueName: \"kubernetes.io/projected/eaa03e1b-1007-4b01-9753-7c0ffa27b09c-kube-api-access-7tnqb\") pod \"rabbitmq-cluster-operator-779fc9694b-9mvxx\" (UID: \"eaa03e1b-1007-4b01-9753-7c0ffa27b09c\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.651675 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tnqb\" (UniqueName: \"kubernetes.io/projected/eaa03e1b-1007-4b01-9753-7c0ffa27b09c-kube-api-access-7tnqb\") pod \"rabbitmq-cluster-operator-779fc9694b-9mvxx\" (UID: \"eaa03e1b-1007-4b01-9753-7c0ffa27b09c\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.669968 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tnqb\" (UniqueName: \"kubernetes.io/projected/eaa03e1b-1007-4b01-9753-7c0ffa27b09c-kube-api-access-7tnqb\") pod \"rabbitmq-cluster-operator-779fc9694b-9mvxx\" (UID: \"eaa03e1b-1007-4b01-9753-7c0ffa27b09c\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" Mar 07 04:34:46 crc kubenswrapper[4689]: I0307 04:34:46.740606 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" Mar 07 04:34:47 crc kubenswrapper[4689]: I0307 04:34:47.209291 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx"] Mar 07 04:34:47 crc kubenswrapper[4689]: I0307 04:34:47.836601 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" event={"ID":"eaa03e1b-1007-4b01-9753-7c0ffa27b09c","Type":"ContainerStarted","Data":"4e8c8445e8a576e4ef56a357206f21f553a14c8f1b7a7b3a48c01309f05bef87"} Mar 07 04:34:49 crc kubenswrapper[4689]: I0307 04:34:49.450261 4689 scope.go:117] "RemoveContainer" containerID="4b82357dc93d44463e296016d0781a5e7e0aa4b1e2f16303973938415ee63403" Mar 07 04:34:51 crc kubenswrapper[4689]: I0307 04:34:51.863606 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" event={"ID":"eaa03e1b-1007-4b01-9753-7c0ffa27b09c","Type":"ContainerStarted","Data":"cfdab96e6f02eeef3b9343183918f4011d67b29e8ce0838ff558df8dae482a4b"} Mar 07 04:34:51 crc kubenswrapper[4689]: I0307 04:34:51.902014 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" podStartSLOduration=2.348266984 podStartE2EDuration="5.901989864s" podCreationTimestamp="2026-03-07 04:34:46 +0000 UTC" firstStartedPulling="2026-03-07 04:34:47.218755534 +0000 UTC m=+932.265139053" lastFinishedPulling="2026-03-07 04:34:50.772478444 +0000 UTC m=+935.818861933" observedRunningTime="2026-03-07 04:34:51.893674313 +0000 UTC m=+936.940057842" watchObservedRunningTime="2026-03-07 04:34:51.901989864 +0000 UTC m=+936.948373393" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.343142 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 
04:34:54.344621 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.346850 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.346920 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-5m65h" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.346849 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.346984 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.347670 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.365563 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.482974 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt8nj\" (UniqueName: \"kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-kube-api-access-zt8nj\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.483387 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8758a96-64ae-4c03-b392-5aa8c68cc641-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 
crc kubenswrapper[4689]: I0307 04:34:54.483600 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8758a96-64ae-4c03-b392-5aa8c68cc641-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.483794 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8758a96-64ae-4c03-b392-5aa8c68cc641-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.483953 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.484132 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.484345 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " 
pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.484499 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.586622 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8758a96-64ae-4c03-b392-5aa8c68cc641-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.586690 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.586755 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.586815 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " 
pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.586882 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.587751 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.588279 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.588624 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt8nj\" (UniqueName: \"kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-kube-api-access-zt8nj\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.588691 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8758a96-64ae-4c03-b392-5aa8c68cc641-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.588769 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8758a96-64ae-4c03-b392-5aa8c68cc641-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.591563 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8758a96-64ae-4c03-b392-5aa8c68cc641-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.595837 4689 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.596044 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b55fed7e9f0b3fc6a217c08f4f3850c3b2bb6969f1d1384d27fd2b36387940be/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.598573 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8758a96-64ae-4c03-b392-5aa8c68cc641-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.600430 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/b8758a96-64ae-4c03-b392-5aa8c68cc641-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.601906 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.636797 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt8nj\" (UniqueName: \"kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-kube-api-access-zt8nj\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.639478 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\") pod \"rabbitmq-server-0\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") " pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:54 crc kubenswrapper[4689]: I0307 04:34:54.662882 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:34:55 crc kubenswrapper[4689]: I0307 04:34:55.184542 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Mar 07 04:34:55 crc kubenswrapper[4689]: W0307 04:34:55.190579 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8758a96_64ae_4c03_b392_5aa8c68cc641.slice/crio-91dca4fbe16cbc8f22bd82b5b3d4386c8f4e61b3f5c4be41296b365b7863fa0e WatchSource:0}: Error finding container 91dca4fbe16cbc8f22bd82b5b3d4386c8f4e61b3f5c4be41296b365b7863fa0e: Status 404 returned error can't find the container with id 91dca4fbe16cbc8f22bd82b5b3d4386c8f4e61b3f5c4be41296b365b7863fa0e Mar 07 04:34:55 crc kubenswrapper[4689]: I0307 04:34:55.904008 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"b8758a96-64ae-4c03-b392-5aa8c68cc641","Type":"ContainerStarted","Data":"91dca4fbe16cbc8f22bd82b5b3d4386c8f4e61b3f5c4be41296b365b7863fa0e"} Mar 07 04:34:55 crc kubenswrapper[4689]: I0307 04:34:55.922499 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-k6756"] Mar 07 04:34:55 crc kubenswrapper[4689]: I0307 04:34:55.923989 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-k6756" Mar 07 04:34:55 crc kubenswrapper[4689]: I0307 04:34:55.928694 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-7xlgp" Mar 07 04:34:55 crc kubenswrapper[4689]: I0307 04:34:55.941158 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqbrd\" (UniqueName: \"kubernetes.io/projected/ef3f79a8-f9ba-43c7-ac9e-04c1094ced84-kube-api-access-tqbrd\") pod \"keystone-operator-index-k6756\" (UID: \"ef3f79a8-f9ba-43c7-ac9e-04c1094ced84\") " pod="openstack-operators/keystone-operator-index-k6756" Mar 07 04:34:55 crc kubenswrapper[4689]: I0307 04:34:55.961775 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-k6756"] Mar 07 04:34:56 crc kubenswrapper[4689]: I0307 04:34:56.043014 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqbrd\" (UniqueName: \"kubernetes.io/projected/ef3f79a8-f9ba-43c7-ac9e-04c1094ced84-kube-api-access-tqbrd\") pod \"keystone-operator-index-k6756\" (UID: \"ef3f79a8-f9ba-43c7-ac9e-04c1094ced84\") " pod="openstack-operators/keystone-operator-index-k6756" Mar 07 04:34:56 crc kubenswrapper[4689]: I0307 04:34:56.067611 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqbrd\" (UniqueName: \"kubernetes.io/projected/ef3f79a8-f9ba-43c7-ac9e-04c1094ced84-kube-api-access-tqbrd\") pod \"keystone-operator-index-k6756\" (UID: \"ef3f79a8-f9ba-43c7-ac9e-04c1094ced84\") " pod="openstack-operators/keystone-operator-index-k6756" Mar 07 04:34:56 crc kubenswrapper[4689]: I0307 04:34:56.266785 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-k6756" Mar 07 04:34:56 crc kubenswrapper[4689]: I0307 04:34:56.829359 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-k6756"] Mar 07 04:34:56 crc kubenswrapper[4689]: I0307 04:34:56.914728 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-k6756" event={"ID":"ef3f79a8-f9ba-43c7-ac9e-04c1094ced84","Type":"ContainerStarted","Data":"e01e50b85f4dcf99bfb6075f465784a98544c1f532e2dc3884bf220a506dab59"} Mar 07 04:35:00 crc kubenswrapper[4689]: I0307 04:35:00.499994 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-k6756"] Mar 07 04:35:01 crc kubenswrapper[4689]: I0307 04:35:01.110371 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-dtt2x"] Mar 07 04:35:01 crc kubenswrapper[4689]: I0307 04:35:01.113730 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-dtt2x" Mar 07 04:35:01 crc kubenswrapper[4689]: I0307 04:35:01.120694 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-dtt2x"] Mar 07 04:35:01 crc kubenswrapper[4689]: I0307 04:35:01.227307 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g55n7\" (UniqueName: \"kubernetes.io/projected/c0bfd96e-646a-4e38-bcd8-c77623fea007-kube-api-access-g55n7\") pod \"keystone-operator-index-dtt2x\" (UID: \"c0bfd96e-646a-4e38-bcd8-c77623fea007\") " pod="openstack-operators/keystone-operator-index-dtt2x" Mar 07 04:35:01 crc kubenswrapper[4689]: I0307 04:35:01.328424 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g55n7\" (UniqueName: \"kubernetes.io/projected/c0bfd96e-646a-4e38-bcd8-c77623fea007-kube-api-access-g55n7\") pod \"keystone-operator-index-dtt2x\" (UID: \"c0bfd96e-646a-4e38-bcd8-c77623fea007\") " pod="openstack-operators/keystone-operator-index-dtt2x" Mar 07 04:35:01 crc kubenswrapper[4689]: I0307 04:35:01.368645 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g55n7\" (UniqueName: \"kubernetes.io/projected/c0bfd96e-646a-4e38-bcd8-c77623fea007-kube-api-access-g55n7\") pod \"keystone-operator-index-dtt2x\" (UID: \"c0bfd96e-646a-4e38-bcd8-c77623fea007\") " pod="openstack-operators/keystone-operator-index-dtt2x" Mar 07 04:35:01 crc kubenswrapper[4689]: I0307 04:35:01.448377 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-dtt2x" Mar 07 04:35:02 crc kubenswrapper[4689]: I0307 04:35:02.486047 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-dtt2x"] Mar 07 04:35:02 crc kubenswrapper[4689]: W0307 04:35:02.498050 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0bfd96e_646a_4e38_bcd8_c77623fea007.slice/crio-8ea83a2121fad5178e4c5403b2378339092453436159ee1d229ad3c08b253d22 WatchSource:0}: Error finding container 8ea83a2121fad5178e4c5403b2378339092453436159ee1d229ad3c08b253d22: Status 404 returned error can't find the container with id 8ea83a2121fad5178e4c5403b2378339092453436159ee1d229ad3c08b253d22 Mar 07 04:35:02 crc kubenswrapper[4689]: I0307 04:35:02.501988 4689 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 04:35:02 crc kubenswrapper[4689]: I0307 04:35:02.973745 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dtt2x" event={"ID":"c0bfd96e-646a-4e38-bcd8-c77623fea007","Type":"ContainerStarted","Data":"8ea83a2121fad5178e4c5403b2378339092453436159ee1d229ad3c08b253d22"} Mar 07 04:35:02 crc kubenswrapper[4689]: I0307 04:35:02.976100 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-k6756" event={"ID":"ef3f79a8-f9ba-43c7-ac9e-04c1094ced84","Type":"ContainerStarted","Data":"dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921"} Mar 07 04:35:02 crc kubenswrapper[4689]: I0307 04:35:02.976281 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-k6756" podUID="ef3f79a8-f9ba-43c7-ac9e-04c1094ced84" containerName="registry-server" containerID="cri-o://dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921" gracePeriod=2 Mar 07 04:35:03 
crc kubenswrapper[4689]: I0307 04:35:03.007771 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-k6756" podStartSLOduration=2.829684192 podStartE2EDuration="8.007753261s" podCreationTimestamp="2026-03-07 04:34:55 +0000 UTC" firstStartedPulling="2026-03-07 04:34:56.849373631 +0000 UTC m=+941.895757120" lastFinishedPulling="2026-03-07 04:35:02.0274427 +0000 UTC m=+947.073826189" observedRunningTime="2026-03-07 04:35:03.005246764 +0000 UTC m=+948.051630263" watchObservedRunningTime="2026-03-07 04:35:03.007753261 +0000 UTC m=+948.054136760" Mar 07 04:35:03 crc kubenswrapper[4689]: I0307 04:35:03.372553 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-k6756" Mar 07 04:35:03 crc kubenswrapper[4689]: I0307 04:35:03.559435 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqbrd\" (UniqueName: \"kubernetes.io/projected/ef3f79a8-f9ba-43c7-ac9e-04c1094ced84-kube-api-access-tqbrd\") pod \"ef3f79a8-f9ba-43c7-ac9e-04c1094ced84\" (UID: \"ef3f79a8-f9ba-43c7-ac9e-04c1094ced84\") " Mar 07 04:35:03 crc kubenswrapper[4689]: I0307 04:35:03.566917 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef3f79a8-f9ba-43c7-ac9e-04c1094ced84-kube-api-access-tqbrd" (OuterVolumeSpecName: "kube-api-access-tqbrd") pod "ef3f79a8-f9ba-43c7-ac9e-04c1094ced84" (UID: "ef3f79a8-f9ba-43c7-ac9e-04c1094ced84"). InnerVolumeSpecName "kube-api-access-tqbrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:35:03 crc kubenswrapper[4689]: I0307 04:35:03.661585 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqbrd\" (UniqueName: \"kubernetes.io/projected/ef3f79a8-f9ba-43c7-ac9e-04c1094ced84-kube-api-access-tqbrd\") on node \"crc\" DevicePath \"\"" Mar 07 04:35:03 crc kubenswrapper[4689]: I0307 04:35:03.984739 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"b8758a96-64ae-4c03-b392-5aa8c68cc641","Type":"ContainerStarted","Data":"df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e"} Mar 07 04:35:03 crc kubenswrapper[4689]: I0307 04:35:03.987901 4689 generic.go:334] "Generic (PLEG): container finished" podID="ef3f79a8-f9ba-43c7-ac9e-04c1094ced84" containerID="dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921" exitCode=0 Mar 07 04:35:03 crc kubenswrapper[4689]: I0307 04:35:03.987981 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-k6756" event={"ID":"ef3f79a8-f9ba-43c7-ac9e-04c1094ced84","Type":"ContainerDied","Data":"dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921"} Mar 07 04:35:03 crc kubenswrapper[4689]: I0307 04:35:03.988007 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-k6756" event={"ID":"ef3f79a8-f9ba-43c7-ac9e-04c1094ced84","Type":"ContainerDied","Data":"e01e50b85f4dcf99bfb6075f465784a98544c1f532e2dc3884bf220a506dab59"} Mar 07 04:35:03 crc kubenswrapper[4689]: I0307 04:35:03.988027 4689 scope.go:117] "RemoveContainer" containerID="dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921" Mar 07 04:35:03 crc kubenswrapper[4689]: I0307 04:35:03.988151 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-k6756" Mar 07 04:35:03 crc kubenswrapper[4689]: I0307 04:35:03.991733 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dtt2x" event={"ID":"c0bfd96e-646a-4e38-bcd8-c77623fea007","Type":"ContainerStarted","Data":"a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1"} Mar 07 04:35:04 crc kubenswrapper[4689]: I0307 04:35:04.036448 4689 scope.go:117] "RemoveContainer" containerID="dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921" Mar 07 04:35:04 crc kubenswrapper[4689]: E0307 04:35:04.037102 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921\": container with ID starting with dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921 not found: ID does not exist" containerID="dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921" Mar 07 04:35:04 crc kubenswrapper[4689]: I0307 04:35:04.037195 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921"} err="failed to get container status \"dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921\": rpc error: code = NotFound desc = could not find container \"dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921\": container with ID starting with dd17e1f5d0bcf0c24a5fbeaf29a285719c725c49236953fcb952c4a7b8bcc921 not found: ID does not exist" Mar 07 04:35:04 crc kubenswrapper[4689]: I0307 04:35:04.080610 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-dtt2x" podStartSLOduration=2.539558851 podStartE2EDuration="3.080573746s" podCreationTimestamp="2026-03-07 04:35:01 +0000 UTC" firstStartedPulling="2026-03-07 
04:35:02.501595741 +0000 UTC m=+947.547979270" lastFinishedPulling="2026-03-07 04:35:03.042610676 +0000 UTC m=+948.088994165" observedRunningTime="2026-03-07 04:35:04.061717085 +0000 UTC m=+949.108100594" watchObservedRunningTime="2026-03-07 04:35:04.080573746 +0000 UTC m=+949.126957285" Mar 07 04:35:04 crc kubenswrapper[4689]: I0307 04:35:04.091395 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-k6756"] Mar 07 04:35:04 crc kubenswrapper[4689]: I0307 04:35:04.099402 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-k6756"] Mar 07 04:35:05 crc kubenswrapper[4689]: I0307 04:35:05.841005 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef3f79a8-f9ba-43c7-ac9e-04c1094ced84" path="/var/lib/kubelet/pods/ef3f79a8-f9ba-43c7-ac9e-04c1094ced84/volumes" Mar 07 04:35:11 crc kubenswrapper[4689]: I0307 04:35:11.448815 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-dtt2x" Mar 07 04:35:11 crc kubenswrapper[4689]: I0307 04:35:11.449491 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-dtt2x" Mar 07 04:35:11 crc kubenswrapper[4689]: I0307 04:35:11.495155 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-dtt2x" Mar 07 04:35:12 crc kubenswrapper[4689]: I0307 04:35:12.094642 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-dtt2x" Mar 07 04:35:13 crc kubenswrapper[4689]: I0307 04:35:13.773164 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9"] Mar 07 04:35:13 crc kubenswrapper[4689]: E0307 04:35:13.773693 4689 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef3f79a8-f9ba-43c7-ac9e-04c1094ced84" containerName="registry-server" Mar 07 04:35:13 crc kubenswrapper[4689]: I0307 04:35:13.773704 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3f79a8-f9ba-43c7-ac9e-04c1094ced84" containerName="registry-server" Mar 07 04:35:13 crc kubenswrapper[4689]: I0307 04:35:13.773838 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef3f79a8-f9ba-43c7-ac9e-04c1094ced84" containerName="registry-server" Mar 07 04:35:13 crc kubenswrapper[4689]: I0307 04:35:13.774717 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:13 crc kubenswrapper[4689]: I0307 04:35:13.778088 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4j8gt" Mar 07 04:35:13 crc kubenswrapper[4689]: I0307 04:35:13.784349 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9"] Mar 07 04:35:13 crc kubenswrapper[4689]: I0307 04:35:13.924886 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:13 crc kubenswrapper[4689]: I0307 04:35:13.924974 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " 
pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:13 crc kubenswrapper[4689]: I0307 04:35:13.925024 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28ndp\" (UniqueName: \"kubernetes.io/projected/789a8575-312c-4ce6-96e5-ccc4c7f8373f-kube-api-access-28ndp\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:14 crc kubenswrapper[4689]: I0307 04:35:14.026907 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:14 crc kubenswrapper[4689]: I0307 04:35:14.027001 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28ndp\" (UniqueName: \"kubernetes.io/projected/789a8575-312c-4ce6-96e5-ccc4c7f8373f-kube-api-access-28ndp\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:14 crc kubenswrapper[4689]: I0307 04:35:14.027072 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:14 crc 
kubenswrapper[4689]: I0307 04:35:14.027520 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:14 crc kubenswrapper[4689]: I0307 04:35:14.027655 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:14 crc kubenswrapper[4689]: I0307 04:35:14.058152 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28ndp\" (UniqueName: \"kubernetes.io/projected/789a8575-312c-4ce6-96e5-ccc4c7f8373f-kube-api-access-28ndp\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:14 crc kubenswrapper[4689]: I0307 04:35:14.109045 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:14 crc kubenswrapper[4689]: I0307 04:35:14.559265 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9"] Mar 07 04:35:15 crc kubenswrapper[4689]: I0307 04:35:15.109591 4689 generic.go:334] "Generic (PLEG): container finished" podID="789a8575-312c-4ce6-96e5-ccc4c7f8373f" containerID="72d3c13287afb9eebc18fb3588b7edfdd594507fe230f25ccb60e256a61abcd1" exitCode=0 Mar 07 04:35:15 crc kubenswrapper[4689]: I0307 04:35:15.109727 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" event={"ID":"789a8575-312c-4ce6-96e5-ccc4c7f8373f","Type":"ContainerDied","Data":"72d3c13287afb9eebc18fb3588b7edfdd594507fe230f25ccb60e256a61abcd1"} Mar 07 04:35:15 crc kubenswrapper[4689]: I0307 04:35:15.110136 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" event={"ID":"789a8575-312c-4ce6-96e5-ccc4c7f8373f","Type":"ContainerStarted","Data":"7254ce8486914e311115057f4b87513b688c51db44cd1c197f6fa9d215be1c48"} Mar 07 04:35:17 crc kubenswrapper[4689]: I0307 04:35:17.132511 4689 generic.go:334] "Generic (PLEG): container finished" podID="789a8575-312c-4ce6-96e5-ccc4c7f8373f" containerID="12513990d7ac6e8d6b3f612b66e2585a45622b21da1c3761348fca6184dca979" exitCode=0 Mar 07 04:35:17 crc kubenswrapper[4689]: I0307 04:35:17.132643 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" event={"ID":"789a8575-312c-4ce6-96e5-ccc4c7f8373f","Type":"ContainerDied","Data":"12513990d7ac6e8d6b3f612b66e2585a45622b21da1c3761348fca6184dca979"} Mar 07 04:35:18 crc kubenswrapper[4689]: I0307 04:35:18.143975 4689 generic.go:334] 
"Generic (PLEG): container finished" podID="789a8575-312c-4ce6-96e5-ccc4c7f8373f" containerID="a9a14cb1784c5d48776ae86aaeb6eb42392ff6406deca6fe1b8f76e6a1ff7cc7" exitCode=0 Mar 07 04:35:18 crc kubenswrapper[4689]: I0307 04:35:18.144040 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" event={"ID":"789a8575-312c-4ce6-96e5-ccc4c7f8373f","Type":"ContainerDied","Data":"a9a14cb1784c5d48776ae86aaeb6eb42392ff6406deca6fe1b8f76e6a1ff7cc7"} Mar 07 04:35:19 crc kubenswrapper[4689]: I0307 04:35:19.466792 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:19 crc kubenswrapper[4689]: I0307 04:35:19.622503 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-bundle\") pod \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " Mar 07 04:35:19 crc kubenswrapper[4689]: I0307 04:35:19.622807 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28ndp\" (UniqueName: \"kubernetes.io/projected/789a8575-312c-4ce6-96e5-ccc4c7f8373f-kube-api-access-28ndp\") pod \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " Mar 07 04:35:19 crc kubenswrapper[4689]: I0307 04:35:19.623003 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-util\") pod \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\" (UID: \"789a8575-312c-4ce6-96e5-ccc4c7f8373f\") " Mar 07 04:35:19 crc kubenswrapper[4689]: I0307 04:35:19.624294 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-bundle" (OuterVolumeSpecName: "bundle") pod "789a8575-312c-4ce6-96e5-ccc4c7f8373f" (UID: "789a8575-312c-4ce6-96e5-ccc4c7f8373f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:35:19 crc kubenswrapper[4689]: I0307 04:35:19.633570 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/789a8575-312c-4ce6-96e5-ccc4c7f8373f-kube-api-access-28ndp" (OuterVolumeSpecName: "kube-api-access-28ndp") pod "789a8575-312c-4ce6-96e5-ccc4c7f8373f" (UID: "789a8575-312c-4ce6-96e5-ccc4c7f8373f"). InnerVolumeSpecName "kube-api-access-28ndp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:35:19 crc kubenswrapper[4689]: I0307 04:35:19.654263 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-util" (OuterVolumeSpecName: "util") pod "789a8575-312c-4ce6-96e5-ccc4c7f8373f" (UID: "789a8575-312c-4ce6-96e5-ccc4c7f8373f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:35:19 crc kubenswrapper[4689]: I0307 04:35:19.725701 4689 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:35:19 crc kubenswrapper[4689]: I0307 04:35:19.725851 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28ndp\" (UniqueName: \"kubernetes.io/projected/789a8575-312c-4ce6-96e5-ccc4c7f8373f-kube-api-access-28ndp\") on node \"crc\" DevicePath \"\"" Mar 07 04:35:19 crc kubenswrapper[4689]: I0307 04:35:19.725874 4689 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/789a8575-312c-4ce6-96e5-ccc4c7f8373f-util\") on node \"crc\" DevicePath \"\"" Mar 07 04:35:20 crc kubenswrapper[4689]: I0307 04:35:20.163421 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" event={"ID":"789a8575-312c-4ce6-96e5-ccc4c7f8373f","Type":"ContainerDied","Data":"7254ce8486914e311115057f4b87513b688c51db44cd1c197f6fa9d215be1c48"} Mar 07 04:35:20 crc kubenswrapper[4689]: I0307 04:35:20.163863 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7254ce8486914e311115057f4b87513b688c51db44cd1c197f6fa9d215be1c48" Mar 07 04:35:20 crc kubenswrapper[4689]: I0307 04:35:20.163492 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9" Mar 07 04:35:29 crc kubenswrapper[4689]: I0307 04:35:29.190147 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:35:29 crc kubenswrapper[4689]: I0307 04:35:29.190632 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:35:30 crc kubenswrapper[4689]: I0307 04:35:30.992421 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v"] Mar 07 04:35:30 crc kubenswrapper[4689]: E0307 04:35:30.992861 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="789a8575-312c-4ce6-96e5-ccc4c7f8373f" containerName="extract" Mar 07 04:35:30 crc kubenswrapper[4689]: I0307 04:35:30.992872 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="789a8575-312c-4ce6-96e5-ccc4c7f8373f" containerName="extract" Mar 07 04:35:30 crc kubenswrapper[4689]: E0307 04:35:30.992890 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="789a8575-312c-4ce6-96e5-ccc4c7f8373f" containerName="pull" Mar 07 04:35:30 crc kubenswrapper[4689]: I0307 04:35:30.992895 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="789a8575-312c-4ce6-96e5-ccc4c7f8373f" containerName="pull" Mar 07 04:35:30 crc kubenswrapper[4689]: E0307 04:35:30.992905 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="789a8575-312c-4ce6-96e5-ccc4c7f8373f" 
containerName="util" Mar 07 04:35:30 crc kubenswrapper[4689]: I0307 04:35:30.992911 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="789a8575-312c-4ce6-96e5-ccc4c7f8373f" containerName="util" Mar 07 04:35:30 crc kubenswrapper[4689]: I0307 04:35:30.993015 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="789a8575-312c-4ce6-96e5-ccc4c7f8373f" containerName="extract" Mar 07 04:35:30 crc kubenswrapper[4689]: I0307 04:35:30.993428 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:30 crc kubenswrapper[4689]: I0307 04:35:30.995184 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Mar 07 04:35:30 crc kubenswrapper[4689]: I0307 04:35:30.995680 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-w4p72" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.002651 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v"] Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.141316 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w859q\" (UniqueName: \"kubernetes.io/projected/2fbf4774-52d5-49ff-8066-d6363f88c3c5-kube-api-access-w859q\") pod \"keystone-operator-controller-manager-6c6bb6574-hcn5v\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.141367 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-webhook-cert\") pod 
\"keystone-operator-controller-manager-6c6bb6574-hcn5v\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.141441 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-apiservice-cert\") pod \"keystone-operator-controller-manager-6c6bb6574-hcn5v\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.243410 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-apiservice-cert\") pod \"keystone-operator-controller-manager-6c6bb6574-hcn5v\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.243558 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w859q\" (UniqueName: \"kubernetes.io/projected/2fbf4774-52d5-49ff-8066-d6363f88c3c5-kube-api-access-w859q\") pod \"keystone-operator-controller-manager-6c6bb6574-hcn5v\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.243627 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-webhook-cert\") pod \"keystone-operator-controller-manager-6c6bb6574-hcn5v\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" 
Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.253243 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-webhook-cert\") pod \"keystone-operator-controller-manager-6c6bb6574-hcn5v\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.256725 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-apiservice-cert\") pod \"keystone-operator-controller-manager-6c6bb6574-hcn5v\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.266945 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w859q\" (UniqueName: \"kubernetes.io/projected/2fbf4774-52d5-49ff-8066-d6363f88c3c5-kube-api-access-w859q\") pod \"keystone-operator-controller-manager-6c6bb6574-hcn5v\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.308114 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.322249 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gbz29"] Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.329953 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.336857 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbz29"] Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.449837 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7tsm\" (UniqueName: \"kubernetes.io/projected/64821447-7a0f-42eb-b837-b2b146564b00-kube-api-access-b7tsm\") pod \"redhat-marketplace-gbz29\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.449893 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-catalog-content\") pod \"redhat-marketplace-gbz29\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.449964 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-utilities\") pod \"redhat-marketplace-gbz29\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.551750 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7tsm\" (UniqueName: \"kubernetes.io/projected/64821447-7a0f-42eb-b837-b2b146564b00-kube-api-access-b7tsm\") pod \"redhat-marketplace-gbz29\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.552101 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-catalog-content\") pod \"redhat-marketplace-gbz29\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.552159 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-utilities\") pod \"redhat-marketplace-gbz29\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.552565 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-catalog-content\") pod \"redhat-marketplace-gbz29\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.552595 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-utilities\") pod \"redhat-marketplace-gbz29\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.574217 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7tsm\" (UniqueName: \"kubernetes.io/projected/64821447-7a0f-42eb-b837-b2b146564b00-kube-api-access-b7tsm\") pod \"redhat-marketplace-gbz29\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.695090 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:31 crc kubenswrapper[4689]: I0307 04:35:31.836388 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v"] Mar 07 04:35:32 crc kubenswrapper[4689]: I0307 04:35:32.141084 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbz29"] Mar 07 04:35:32 crc kubenswrapper[4689]: I0307 04:35:32.256624 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" event={"ID":"2fbf4774-52d5-49ff-8066-d6363f88c3c5","Type":"ContainerStarted","Data":"ccdc84d8b04dc030ad4ec723fbe5fc92629b27c5512bd0e70665c7bb8816d240"} Mar 07 04:35:32 crc kubenswrapper[4689]: I0307 04:35:32.258264 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbz29" event={"ID":"64821447-7a0f-42eb-b837-b2b146564b00","Type":"ContainerStarted","Data":"36cf94e252f5230db3a1059304237ccd6b11592f3054e1d3cb9f5b94f294a45c"} Mar 07 04:35:33 crc kubenswrapper[4689]: I0307 04:35:33.266957 4689 generic.go:334] "Generic (PLEG): container finished" podID="64821447-7a0f-42eb-b837-b2b146564b00" containerID="ccc85a57e36d88567e27c9ebd3cadbb0d375be99a3daac8ce64878f2b19a5faf" exitCode=0 Mar 07 04:35:33 crc kubenswrapper[4689]: I0307 04:35:33.267288 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbz29" event={"ID":"64821447-7a0f-42eb-b837-b2b146564b00","Type":"ContainerDied","Data":"ccc85a57e36d88567e27c9ebd3cadbb0d375be99a3daac8ce64878f2b19a5faf"} Mar 07 04:35:35 crc kubenswrapper[4689]: I0307 04:35:35.283986 4689 generic.go:334] "Generic (PLEG): container finished" podID="b8758a96-64ae-4c03-b392-5aa8c68cc641" containerID="df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e" exitCode=0 Mar 07 04:35:35 crc kubenswrapper[4689]: I0307 
04:35:35.284095 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"b8758a96-64ae-4c03-b392-5aa8c68cc641","Type":"ContainerDied","Data":"df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e"} Mar 07 04:35:36 crc kubenswrapper[4689]: I0307 04:35:36.304233 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"b8758a96-64ae-4c03-b392-5aa8c68cc641","Type":"ContainerStarted","Data":"fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23"} Mar 07 04:35:36 crc kubenswrapper[4689]: I0307 04:35:36.304685 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:35:36 crc kubenswrapper[4689]: I0307 04:35:36.306094 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" event={"ID":"2fbf4774-52d5-49ff-8066-d6363f88c3c5","Type":"ContainerStarted","Data":"43dc6c8c5209facffc79f545ef744c2506053398b1e14289fb2ddb4ed03525e5"} Mar 07 04:35:36 crc kubenswrapper[4689]: I0307 04:35:36.306480 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:36 crc kubenswrapper[4689]: I0307 04:35:36.307820 4689 generic.go:334] "Generic (PLEG): container finished" podID="64821447-7a0f-42eb-b837-b2b146564b00" containerID="2ca2b0bc699dfe17fa7a31c49df9688bf28e3b0d8a3e0e29af1c5fe759dbe023" exitCode=0 Mar 07 04:35:36 crc kubenswrapper[4689]: I0307 04:35:36.307885 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbz29" event={"ID":"64821447-7a0f-42eb-b837-b2b146564b00","Type":"ContainerDied","Data":"2ca2b0bc699dfe17fa7a31c49df9688bf28e3b0d8a3e0e29af1c5fe759dbe023"} Mar 07 04:35:36 crc kubenswrapper[4689]: I0307 04:35:36.337578 4689 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.419855659 podStartE2EDuration="43.337551735s" podCreationTimestamp="2026-03-07 04:34:53 +0000 UTC" firstStartedPulling="2026-03-07 04:34:55.1936325 +0000 UTC m=+940.240016029" lastFinishedPulling="2026-03-07 04:35:02.111328616 +0000 UTC m=+947.157712105" observedRunningTime="2026-03-07 04:35:36.336540918 +0000 UTC m=+981.382924447" watchObservedRunningTime="2026-03-07 04:35:36.337551735 +0000 UTC m=+981.383935254" Mar 07 04:35:36 crc kubenswrapper[4689]: I0307 04:35:36.375182 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" podStartSLOduration=2.829527304 podStartE2EDuration="6.375145047s" podCreationTimestamp="2026-03-07 04:35:30 +0000 UTC" firstStartedPulling="2026-03-07 04:35:31.840718784 +0000 UTC m=+976.887102273" lastFinishedPulling="2026-03-07 04:35:35.386336527 +0000 UTC m=+980.432720016" observedRunningTime="2026-03-07 04:35:36.369424523 +0000 UTC m=+981.415808022" watchObservedRunningTime="2026-03-07 04:35:36.375145047 +0000 UTC m=+981.421528556" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.305598 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9cpft"] Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.307329 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.318916 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbz29" event={"ID":"64821447-7a0f-42eb-b837-b2b146564b00","Type":"ContainerStarted","Data":"6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b"} Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.327971 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9cpft"] Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.369254 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gbz29" podStartSLOduration=2.913449096 podStartE2EDuration="6.36923958s" podCreationTimestamp="2026-03-07 04:35:31 +0000 UTC" firstStartedPulling="2026-03-07 04:35:33.269416957 +0000 UTC m=+978.315800446" lastFinishedPulling="2026-03-07 04:35:36.725207411 +0000 UTC m=+981.771590930" observedRunningTime="2026-03-07 04:35:37.367213705 +0000 UTC m=+982.413597214" watchObservedRunningTime="2026-03-07 04:35:37.36923958 +0000 UTC m=+982.415623059" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.446780 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-utilities\") pod \"certified-operators-9cpft\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.446839 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr8rk\" (UniqueName: \"kubernetes.io/projected/dd67439f-628c-46f1-a216-273ee2d1bb0f-kube-api-access-rr8rk\") pod \"certified-operators-9cpft\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " 
pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.446957 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-catalog-content\") pod \"certified-operators-9cpft\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.547977 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-utilities\") pod \"certified-operators-9cpft\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.548032 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr8rk\" (UniqueName: \"kubernetes.io/projected/dd67439f-628c-46f1-a216-273ee2d1bb0f-kube-api-access-rr8rk\") pod \"certified-operators-9cpft\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.548115 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-catalog-content\") pod \"certified-operators-9cpft\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.548570 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-utilities\") pod \"certified-operators-9cpft\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " 
pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.548620 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-catalog-content\") pod \"certified-operators-9cpft\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.577401 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr8rk\" (UniqueName: \"kubernetes.io/projected/dd67439f-628c-46f1-a216-273ee2d1bb0f-kube-api-access-rr8rk\") pod \"certified-operators-9cpft\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:37 crc kubenswrapper[4689]: I0307 04:35:37.623203 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:38 crc kubenswrapper[4689]: I0307 04:35:38.105283 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9cpft"] Mar 07 04:35:38 crc kubenswrapper[4689]: W0307 04:35:38.109303 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd67439f_628c_46f1_a216_273ee2d1bb0f.slice/crio-248a00c87aeb690d2ddf485360dea1a944c65aff22a21e1311800a4389aaf278 WatchSource:0}: Error finding container 248a00c87aeb690d2ddf485360dea1a944c65aff22a21e1311800a4389aaf278: Status 404 returned error can't find the container with id 248a00c87aeb690d2ddf485360dea1a944c65aff22a21e1311800a4389aaf278 Mar 07 04:35:38 crc kubenswrapper[4689]: I0307 04:35:38.325408 4689 generic.go:334] "Generic (PLEG): container finished" podID="dd67439f-628c-46f1-a216-273ee2d1bb0f" containerID="adee63e3bb91bb5e073db34e85558be841665c13527e1368356ac5f438ed483b" 
exitCode=0 Mar 07 04:35:38 crc kubenswrapper[4689]: I0307 04:35:38.325493 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cpft" event={"ID":"dd67439f-628c-46f1-a216-273ee2d1bb0f","Type":"ContainerDied","Data":"adee63e3bb91bb5e073db34e85558be841665c13527e1368356ac5f438ed483b"} Mar 07 04:35:38 crc kubenswrapper[4689]: I0307 04:35:38.325538 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cpft" event={"ID":"dd67439f-628c-46f1-a216-273ee2d1bb0f","Type":"ContainerStarted","Data":"248a00c87aeb690d2ddf485360dea1a944c65aff22a21e1311800a4389aaf278"} Mar 07 04:35:39 crc kubenswrapper[4689]: I0307 04:35:39.332227 4689 generic.go:334] "Generic (PLEG): container finished" podID="dd67439f-628c-46f1-a216-273ee2d1bb0f" containerID="f674463bea7de68a169573ff355b679ceddd54cb82aebd0e1eea58467bdc64b9" exitCode=0 Mar 07 04:35:39 crc kubenswrapper[4689]: I0307 04:35:39.332318 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cpft" event={"ID":"dd67439f-628c-46f1-a216-273ee2d1bb0f","Type":"ContainerDied","Data":"f674463bea7de68a169573ff355b679ceddd54cb82aebd0e1eea58467bdc64b9"} Mar 07 04:35:40 crc kubenswrapper[4689]: I0307 04:35:40.341464 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cpft" event={"ID":"dd67439f-628c-46f1-a216-273ee2d1bb0f","Type":"ContainerStarted","Data":"5bdfce13dd5ba8d71fe8ada28a6f267e74144e024f6d21818178901bae282253"} Mar 07 04:35:40 crc kubenswrapper[4689]: I0307 04:35:40.362726 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9cpft" podStartSLOduration=1.931092687 podStartE2EDuration="3.362702838s" podCreationTimestamp="2026-03-07 04:35:37 +0000 UTC" firstStartedPulling="2026-03-07 04:35:38.326631804 +0000 UTC m=+983.373015293" lastFinishedPulling="2026-03-07 04:35:39.758241945 
+0000 UTC m=+984.804625444" observedRunningTime="2026-03-07 04:35:40.3587187 +0000 UTC m=+985.405102199" watchObservedRunningTime="2026-03-07 04:35:40.362702838 +0000 UTC m=+985.409086367" Mar 07 04:35:41 crc kubenswrapper[4689]: I0307 04:35:41.315253 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:35:41 crc kubenswrapper[4689]: I0307 04:35:41.695795 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:41 crc kubenswrapper[4689]: I0307 04:35:41.696235 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:41 crc kubenswrapper[4689]: I0307 04:35:41.762019 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:42 crc kubenswrapper[4689]: I0307 04:35:42.413296 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.277527 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-v6n8c"] Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.278580 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-v6n8c" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.311274 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-v6n8c"] Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.350696 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-df65-account-create-update-tpgsg"] Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.351459 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.357557 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.373544 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-df65-account-create-update-tpgsg"] Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.427486 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7d2q\" (UniqueName: \"kubernetes.io/projected/c3c0e0af-98ed-4f4b-a406-d883afe0395b-kube-api-access-v7d2q\") pod \"keystone-df65-account-create-update-tpgsg\" (UID: \"c3c0e0af-98ed-4f4b-a406-d883afe0395b\") " pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.427561 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52pwp\" (UniqueName: \"kubernetes.io/projected/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-kube-api-access-52pwp\") pod \"keystone-db-create-v6n8c\" (UID: \"d06beb76-5cac-4ccf-9478-9dcb7ba03aee\") " pod="glance-kuttl-tests/keystone-db-create-v6n8c" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.427580 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3c0e0af-98ed-4f4b-a406-d883afe0395b-operator-scripts\") pod \"keystone-df65-account-create-update-tpgsg\" (UID: \"c3c0e0af-98ed-4f4b-a406-d883afe0395b\") " pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.427827 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-operator-scripts\") pod \"keystone-db-create-v6n8c\" (UID: \"d06beb76-5cac-4ccf-9478-9dcb7ba03aee\") " pod="glance-kuttl-tests/keystone-db-create-v6n8c" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.496978 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbz29"] Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.529365 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3c0e0af-98ed-4f4b-a406-d883afe0395b-operator-scripts\") pod \"keystone-df65-account-create-update-tpgsg\" (UID: \"c3c0e0af-98ed-4f4b-a406-d883afe0395b\") " pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.529442 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-operator-scripts\") pod \"keystone-db-create-v6n8c\" (UID: \"d06beb76-5cac-4ccf-9478-9dcb7ba03aee\") " pod="glance-kuttl-tests/keystone-db-create-v6n8c" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.529493 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7d2q\" (UniqueName: \"kubernetes.io/projected/c3c0e0af-98ed-4f4b-a406-d883afe0395b-kube-api-access-v7d2q\") pod \"keystone-df65-account-create-update-tpgsg\" (UID: \"c3c0e0af-98ed-4f4b-a406-d883afe0395b\") " pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.529533 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52pwp\" (UniqueName: \"kubernetes.io/projected/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-kube-api-access-52pwp\") pod \"keystone-db-create-v6n8c\" (UID: \"d06beb76-5cac-4ccf-9478-9dcb7ba03aee\") 
" pod="glance-kuttl-tests/keystone-db-create-v6n8c" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.530267 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3c0e0af-98ed-4f4b-a406-d883afe0395b-operator-scripts\") pod \"keystone-df65-account-create-update-tpgsg\" (UID: \"c3c0e0af-98ed-4f4b-a406-d883afe0395b\") " pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.530399 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-operator-scripts\") pod \"keystone-db-create-v6n8c\" (UID: \"d06beb76-5cac-4ccf-9478-9dcb7ba03aee\") " pod="glance-kuttl-tests/keystone-db-create-v6n8c" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.552793 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52pwp\" (UniqueName: \"kubernetes.io/projected/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-kube-api-access-52pwp\") pod \"keystone-db-create-v6n8c\" (UID: \"d06beb76-5cac-4ccf-9478-9dcb7ba03aee\") " pod="glance-kuttl-tests/keystone-db-create-v6n8c" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.566711 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7d2q\" (UniqueName: \"kubernetes.io/projected/c3c0e0af-98ed-4f4b-a406-d883afe0395b-kube-api-access-v7d2q\") pod \"keystone-df65-account-create-update-tpgsg\" (UID: \"c3c0e0af-98ed-4f4b-a406-d883afe0395b\") " pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.592817 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-v6n8c" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.665323 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" Mar 07 04:35:43 crc kubenswrapper[4689]: I0307 04:35:43.907065 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-v6n8c"] Mar 07 04:35:44 crc kubenswrapper[4689]: I0307 04:35:44.207209 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-df65-account-create-update-tpgsg"] Mar 07 04:35:44 crc kubenswrapper[4689]: W0307 04:35:44.215640 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3c0e0af_98ed_4f4b_a406_d883afe0395b.slice/crio-3bb044ba8bb22a3722c810259a435354e579645bd7ee96dee153ae7a0c5561b4 WatchSource:0}: Error finding container 3bb044ba8bb22a3722c810259a435354e579645bd7ee96dee153ae7a0c5561b4: Status 404 returned error can't find the container with id 3bb044ba8bb22a3722c810259a435354e579645bd7ee96dee153ae7a0c5561b4 Mar 07 04:35:44 crc kubenswrapper[4689]: I0307 04:35:44.373661 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" event={"ID":"c3c0e0af-98ed-4f4b-a406-d883afe0395b","Type":"ContainerStarted","Data":"3375165fd607e9989ebac042353510dc2d585b80cbb826cc8af79ab15a85781e"} Mar 07 04:35:44 crc kubenswrapper[4689]: I0307 04:35:44.373934 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" event={"ID":"c3c0e0af-98ed-4f4b-a406-d883afe0395b","Type":"ContainerStarted","Data":"3bb044ba8bb22a3722c810259a435354e579645bd7ee96dee153ae7a0c5561b4"} Mar 07 04:35:44 crc kubenswrapper[4689]: I0307 04:35:44.376057 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-v6n8c" event={"ID":"d06beb76-5cac-4ccf-9478-9dcb7ba03aee","Type":"ContainerStarted","Data":"6d84c35f61707c7b38c0f7abfad7b43658ceb65dc79a2d9b8c4aaeb55210b97f"} Mar 07 04:35:44 
crc kubenswrapper[4689]: I0307 04:35:44.376100 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-v6n8c" event={"ID":"d06beb76-5cac-4ccf-9478-9dcb7ba03aee","Type":"ContainerStarted","Data":"5cad20bdd96af80d5fde8aaa5c38036199537f2e9d373a21344fcc0e25d6a2f8"} Mar 07 04:35:44 crc kubenswrapper[4689]: I0307 04:35:44.388777 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" podStartSLOduration=1.3887587639999999 podStartE2EDuration="1.388758764s" podCreationTimestamp="2026-03-07 04:35:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:35:44.386974326 +0000 UTC m=+989.433357825" watchObservedRunningTime="2026-03-07 04:35:44.388758764 +0000 UTC m=+989.435142263" Mar 07 04:35:44 crc kubenswrapper[4689]: I0307 04:35:44.406415 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-create-v6n8c" podStartSLOduration=1.406389119 podStartE2EDuration="1.406389119s" podCreationTimestamp="2026-03-07 04:35:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:35:44.401816866 +0000 UTC m=+989.448200385" watchObservedRunningTime="2026-03-07 04:35:44.406389119 +0000 UTC m=+989.452772618" Mar 07 04:35:45 crc kubenswrapper[4689]: I0307 04:35:45.386028 4689 generic.go:334] "Generic (PLEG): container finished" podID="c3c0e0af-98ed-4f4b-a406-d883afe0395b" containerID="3375165fd607e9989ebac042353510dc2d585b80cbb826cc8af79ab15a85781e" exitCode=0 Mar 07 04:35:45 crc kubenswrapper[4689]: I0307 04:35:45.386100 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" 
event={"ID":"c3c0e0af-98ed-4f4b-a406-d883afe0395b","Type":"ContainerDied","Data":"3375165fd607e9989ebac042353510dc2d585b80cbb826cc8af79ab15a85781e"} Mar 07 04:35:45 crc kubenswrapper[4689]: I0307 04:35:45.389093 4689 generic.go:334] "Generic (PLEG): container finished" podID="d06beb76-5cac-4ccf-9478-9dcb7ba03aee" containerID="6d84c35f61707c7b38c0f7abfad7b43658ceb65dc79a2d9b8c4aaeb55210b97f" exitCode=0 Mar 07 04:35:45 crc kubenswrapper[4689]: I0307 04:35:45.389211 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-v6n8c" event={"ID":"d06beb76-5cac-4ccf-9478-9dcb7ba03aee","Type":"ContainerDied","Data":"6d84c35f61707c7b38c0f7abfad7b43658ceb65dc79a2d9b8c4aaeb55210b97f"} Mar 07 04:35:45 crc kubenswrapper[4689]: I0307 04:35:45.389457 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gbz29" podUID="64821447-7a0f-42eb-b837-b2b146564b00" containerName="registry-server" containerID="cri-o://6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b" gracePeriod=2 Mar 07 04:35:45 crc kubenswrapper[4689]: I0307 04:35:45.946020 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.078557 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-utilities\") pod \"64821447-7a0f-42eb-b837-b2b146564b00\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.078661 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7tsm\" (UniqueName: \"kubernetes.io/projected/64821447-7a0f-42eb-b837-b2b146564b00-kube-api-access-b7tsm\") pod \"64821447-7a0f-42eb-b837-b2b146564b00\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.078716 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-catalog-content\") pod \"64821447-7a0f-42eb-b837-b2b146564b00\" (UID: \"64821447-7a0f-42eb-b837-b2b146564b00\") " Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.079611 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-utilities" (OuterVolumeSpecName: "utilities") pod "64821447-7a0f-42eb-b837-b2b146564b00" (UID: "64821447-7a0f-42eb-b837-b2b146564b00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.085133 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64821447-7a0f-42eb-b837-b2b146564b00-kube-api-access-b7tsm" (OuterVolumeSpecName: "kube-api-access-b7tsm") pod "64821447-7a0f-42eb-b837-b2b146564b00" (UID: "64821447-7a0f-42eb-b837-b2b146564b00"). InnerVolumeSpecName "kube-api-access-b7tsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.106854 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rngx5"] Mar 07 04:35:46 crc kubenswrapper[4689]: E0307 04:35:46.107173 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64821447-7a0f-42eb-b837-b2b146564b00" containerName="extract-utilities" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.107199 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="64821447-7a0f-42eb-b837-b2b146564b00" containerName="extract-utilities" Mar 07 04:35:46 crc kubenswrapper[4689]: E0307 04:35:46.107219 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64821447-7a0f-42eb-b837-b2b146564b00" containerName="extract-content" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.107225 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="64821447-7a0f-42eb-b837-b2b146564b00" containerName="extract-content" Mar 07 04:35:46 crc kubenswrapper[4689]: E0307 04:35:46.107235 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64821447-7a0f-42eb-b837-b2b146564b00" containerName="registry-server" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.107241 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="64821447-7a0f-42eb-b837-b2b146564b00" containerName="registry-server" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.107349 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="64821447-7a0f-42eb-b837-b2b146564b00" containerName="registry-server" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.108195 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.109477 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64821447-7a0f-42eb-b837-b2b146564b00" (UID: "64821447-7a0f-42eb-b837-b2b146564b00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.119659 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rngx5"] Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.180008 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.180040 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7tsm\" (UniqueName: \"kubernetes.io/projected/64821447-7a0f-42eb-b837-b2b146564b00-kube-api-access-b7tsm\") on node \"crc\" DevicePath \"\"" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.180051 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64821447-7a0f-42eb-b837-b2b146564b00-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.281314 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t464\" (UniqueName: \"kubernetes.io/projected/92d8c131-7403-441a-8112-7dcd003edf5f-kube-api-access-4t464\") pod \"community-operators-rngx5\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.281400 
4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-utilities\") pod \"community-operators-rngx5\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.281433 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-catalog-content\") pod \"community-operators-rngx5\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.383031 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-catalog-content\") pod \"community-operators-rngx5\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.383111 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t464\" (UniqueName: \"kubernetes.io/projected/92d8c131-7403-441a-8112-7dcd003edf5f-kube-api-access-4t464\") pod \"community-operators-rngx5\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.383171 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-utilities\") pod \"community-operators-rngx5\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 
04:35:46.383628 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-utilities\") pod \"community-operators-rngx5\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.383840 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-catalog-content\") pod \"community-operators-rngx5\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.396752 4689 generic.go:334] "Generic (PLEG): container finished" podID="64821447-7a0f-42eb-b837-b2b146564b00" containerID="6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b" exitCode=0 Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.396794 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbz29" event={"ID":"64821447-7a0f-42eb-b837-b2b146564b00","Type":"ContainerDied","Data":"6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b"} Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.396854 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbz29" event={"ID":"64821447-7a0f-42eb-b837-b2b146564b00","Type":"ContainerDied","Data":"36cf94e252f5230db3a1059304237ccd6b11592f3054e1d3cb9f5b94f294a45c"} Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.396858 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbz29" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.396873 4689 scope.go:117] "RemoveContainer" containerID="6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.414723 4689 scope.go:117] "RemoveContainer" containerID="2ca2b0bc699dfe17fa7a31c49df9688bf28e3b0d8a3e0e29af1c5fe759dbe023" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.424963 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t464\" (UniqueName: \"kubernetes.io/projected/92d8c131-7403-441a-8112-7dcd003edf5f-kube-api-access-4t464\") pod \"community-operators-rngx5\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.436969 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.451133 4689 scope.go:117] "RemoveContainer" containerID="ccc85a57e36d88567e27c9ebd3cadbb0d375be99a3daac8ce64878f2b19a5faf" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.451803 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbz29"] Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.456367 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbz29"] Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.487400 4689 scope.go:117] "RemoveContainer" containerID="6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b" Mar 07 04:35:46 crc kubenswrapper[4689]: E0307 04:35:46.487963 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b\": 
container with ID starting with 6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b not found: ID does not exist" containerID="6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.487994 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b"} err="failed to get container status \"6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b\": rpc error: code = NotFound desc = could not find container \"6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b\": container with ID starting with 6ef4ed7dcbf276aa9b626a3b9340cbb01b9144f416f2c48202a8649b1ff8845b not found: ID does not exist" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.488017 4689 scope.go:117] "RemoveContainer" containerID="2ca2b0bc699dfe17fa7a31c49df9688bf28e3b0d8a3e0e29af1c5fe759dbe023" Mar 07 04:35:46 crc kubenswrapper[4689]: E0307 04:35:46.489203 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ca2b0bc699dfe17fa7a31c49df9688bf28e3b0d8a3e0e29af1c5fe759dbe023\": container with ID starting with 2ca2b0bc699dfe17fa7a31c49df9688bf28e3b0d8a3e0e29af1c5fe759dbe023 not found: ID does not exist" containerID="2ca2b0bc699dfe17fa7a31c49df9688bf28e3b0d8a3e0e29af1c5fe759dbe023" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.489249 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca2b0bc699dfe17fa7a31c49df9688bf28e3b0d8a3e0e29af1c5fe759dbe023"} err="failed to get container status \"2ca2b0bc699dfe17fa7a31c49df9688bf28e3b0d8a3e0e29af1c5fe759dbe023\": rpc error: code = NotFound desc = could not find container \"2ca2b0bc699dfe17fa7a31c49df9688bf28e3b0d8a3e0e29af1c5fe759dbe023\": container with ID starting with 
2ca2b0bc699dfe17fa7a31c49df9688bf28e3b0d8a3e0e29af1c5fe759dbe023 not found: ID does not exist" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.489277 4689 scope.go:117] "RemoveContainer" containerID="ccc85a57e36d88567e27c9ebd3cadbb0d375be99a3daac8ce64878f2b19a5faf" Mar 07 04:35:46 crc kubenswrapper[4689]: E0307 04:35:46.490407 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccc85a57e36d88567e27c9ebd3cadbb0d375be99a3daac8ce64878f2b19a5faf\": container with ID starting with ccc85a57e36d88567e27c9ebd3cadbb0d375be99a3daac8ce64878f2b19a5faf not found: ID does not exist" containerID="ccc85a57e36d88567e27c9ebd3cadbb0d375be99a3daac8ce64878f2b19a5faf" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.490448 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccc85a57e36d88567e27c9ebd3cadbb0d375be99a3daac8ce64878f2b19a5faf"} err="failed to get container status \"ccc85a57e36d88567e27c9ebd3cadbb0d375be99a3daac8ce64878f2b19a5faf\": rpc error: code = NotFound desc = could not find container \"ccc85a57e36d88567e27c9ebd3cadbb0d375be99a3daac8ce64878f2b19a5faf\": container with ID starting with ccc85a57e36d88567e27c9ebd3cadbb0d375be99a3daac8ce64878f2b19a5faf not found: ID does not exist" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.754481 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-v6n8c" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.756522 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.893829 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-operator-scripts\") pod \"d06beb76-5cac-4ccf-9478-9dcb7ba03aee\" (UID: \"d06beb76-5cac-4ccf-9478-9dcb7ba03aee\") " Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.893892 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3c0e0af-98ed-4f4b-a406-d883afe0395b-operator-scripts\") pod \"c3c0e0af-98ed-4f4b-a406-d883afe0395b\" (UID: \"c3c0e0af-98ed-4f4b-a406-d883afe0395b\") " Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.893926 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7d2q\" (UniqueName: \"kubernetes.io/projected/c3c0e0af-98ed-4f4b-a406-d883afe0395b-kube-api-access-v7d2q\") pod \"c3c0e0af-98ed-4f4b-a406-d883afe0395b\" (UID: \"c3c0e0af-98ed-4f4b-a406-d883afe0395b\") " Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.893958 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52pwp\" (UniqueName: \"kubernetes.io/projected/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-kube-api-access-52pwp\") pod \"d06beb76-5cac-4ccf-9478-9dcb7ba03aee\" (UID: \"d06beb76-5cac-4ccf-9478-9dcb7ba03aee\") " Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.894707 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c0e0af-98ed-4f4b-a406-d883afe0395b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3c0e0af-98ed-4f4b-a406-d883afe0395b" (UID: "c3c0e0af-98ed-4f4b-a406-d883afe0395b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.894897 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d06beb76-5cac-4ccf-9478-9dcb7ba03aee" (UID: "d06beb76-5cac-4ccf-9478-9dcb7ba03aee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.897976 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c0e0af-98ed-4f4b-a406-d883afe0395b-kube-api-access-v7d2q" (OuterVolumeSpecName: "kube-api-access-v7d2q") pod "c3c0e0af-98ed-4f4b-a406-d883afe0395b" (UID: "c3c0e0af-98ed-4f4b-a406-d883afe0395b"). InnerVolumeSpecName "kube-api-access-v7d2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.901415 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-kube-api-access-52pwp" (OuterVolumeSpecName: "kube-api-access-52pwp") pod "d06beb76-5cac-4ccf-9478-9dcb7ba03aee" (UID: "d06beb76-5cac-4ccf-9478-9dcb7ba03aee"). InnerVolumeSpecName "kube-api-access-52pwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.995564 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.995605 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3c0e0af-98ed-4f4b-a406-d883afe0395b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.995617 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7d2q\" (UniqueName: \"kubernetes.io/projected/c3c0e0af-98ed-4f4b-a406-d883afe0395b-kube-api-access-v7d2q\") on node \"crc\" DevicePath \"\"" Mar 07 04:35:46 crc kubenswrapper[4689]: I0307 04:35:46.995629 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52pwp\" (UniqueName: \"kubernetes.io/projected/d06beb76-5cac-4ccf-9478-9dcb7ba03aee-kube-api-access-52pwp\") on node \"crc\" DevicePath \"\"" Mar 07 04:35:47 crc kubenswrapper[4689]: W0307 04:35:47.023008 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d8c131_7403_441a_8112_7dcd003edf5f.slice/crio-8ef5f0d1ac05dc2311ccaa323b3369a441ad57c9726c15f334241c628231e871 WatchSource:0}: Error finding container 8ef5f0d1ac05dc2311ccaa323b3369a441ad57c9726c15f334241c628231e871: Status 404 returned error can't find the container with id 8ef5f0d1ac05dc2311ccaa323b3369a441ad57c9726c15f334241c628231e871 Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.023637 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rngx5"] Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.407406 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-v6n8c" Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.407455 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-v6n8c" event={"ID":"d06beb76-5cac-4ccf-9478-9dcb7ba03aee","Type":"ContainerDied","Data":"5cad20bdd96af80d5fde8aaa5c38036199537f2e9d373a21344fcc0e25d6a2f8"} Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.407830 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cad20bdd96af80d5fde8aaa5c38036199537f2e9d373a21344fcc0e25d6a2f8" Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.411403 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.411398 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-df65-account-create-update-tpgsg" event={"ID":"c3c0e0af-98ed-4f4b-a406-d883afe0395b","Type":"ContainerDied","Data":"3bb044ba8bb22a3722c810259a435354e579645bd7ee96dee153ae7a0c5561b4"} Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.411571 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bb044ba8bb22a3722c810259a435354e579645bd7ee96dee153ae7a0c5561b4" Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.413448 4689 generic.go:334] "Generic (PLEG): container finished" podID="92d8c131-7403-441a-8112-7dcd003edf5f" containerID="df3b93ae3552506ea438d14336fc029052e4d8bdd7116a7e6a768e5406f93965" exitCode=0 Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.413475 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rngx5" event={"ID":"92d8c131-7403-441a-8112-7dcd003edf5f","Type":"ContainerDied","Data":"df3b93ae3552506ea438d14336fc029052e4d8bdd7116a7e6a768e5406f93965"} Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 
04:35:47.413496 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rngx5" event={"ID":"92d8c131-7403-441a-8112-7dcd003edf5f","Type":"ContainerStarted","Data":"8ef5f0d1ac05dc2311ccaa323b3369a441ad57c9726c15f334241c628231e871"} Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.623629 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.623709 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.670562 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:47 crc kubenswrapper[4689]: I0307 04:35:47.860261 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64821447-7a0f-42eb-b837-b2b146564b00" path="/var/lib/kubelet/pods/64821447-7a0f-42eb-b837-b2b146564b00/volumes" Mar 07 04:35:48 crc kubenswrapper[4689]: I0307 04:35:48.518003 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:35:49 crc kubenswrapper[4689]: I0307 04:35:49.442294 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rngx5" event={"ID":"92d8c131-7403-441a-8112-7dcd003edf5f","Type":"ContainerStarted","Data":"3cd7bb69a72a2c3b495a3e3991ebca378ef1f179aa3ccade7ec2a266b759a75e"} Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.116513 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-2rfwv"] Mar 07 04:35:50 crc kubenswrapper[4689]: E0307 04:35:50.116914 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06beb76-5cac-4ccf-9478-9dcb7ba03aee" containerName="mariadb-database-create" Mar 
07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.116936 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06beb76-5cac-4ccf-9478-9dcb7ba03aee" containerName="mariadb-database-create" Mar 07 04:35:50 crc kubenswrapper[4689]: E0307 04:35:50.116955 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c0e0af-98ed-4f4b-a406-d883afe0395b" containerName="mariadb-account-create-update" Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.116971 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c0e0af-98ed-4f4b-a406-d883afe0395b" containerName="mariadb-account-create-update" Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.117236 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c0e0af-98ed-4f4b-a406-d883afe0395b" containerName="mariadb-account-create-update" Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.117269 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06beb76-5cac-4ccf-9478-9dcb7ba03aee" containerName="mariadb-database-create" Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.117949 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-2rfwv" Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.122333 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-px8mz" Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.143310 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-2rfwv"] Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.245251 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rbtl\" (UniqueName: \"kubernetes.io/projected/4fbda293-a134-43ca-8f42-6bc32bae4b57-kube-api-access-4rbtl\") pod \"horizon-operator-index-2rfwv\" (UID: \"4fbda293-a134-43ca-8f42-6bc32bae4b57\") " pod="openstack-operators/horizon-operator-index-2rfwv" Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.347294 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rbtl\" (UniqueName: \"kubernetes.io/projected/4fbda293-a134-43ca-8f42-6bc32bae4b57-kube-api-access-4rbtl\") pod \"horizon-operator-index-2rfwv\" (UID: \"4fbda293-a134-43ca-8f42-6bc32bae4b57\") " pod="openstack-operators/horizon-operator-index-2rfwv" Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.380852 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rbtl\" (UniqueName: \"kubernetes.io/projected/4fbda293-a134-43ca-8f42-6bc32bae4b57-kube-api-access-4rbtl\") pod \"horizon-operator-index-2rfwv\" (UID: \"4fbda293-a134-43ca-8f42-6bc32bae4b57\") " pod="openstack-operators/horizon-operator-index-2rfwv" Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.438999 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-2rfwv" Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.455371 4689 generic.go:334] "Generic (PLEG): container finished" podID="92d8c131-7403-441a-8112-7dcd003edf5f" containerID="3cd7bb69a72a2c3b495a3e3991ebca378ef1f179aa3ccade7ec2a266b759a75e" exitCode=0 Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.455461 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rngx5" event={"ID":"92d8c131-7403-441a-8112-7dcd003edf5f","Type":"ContainerDied","Data":"3cd7bb69a72a2c3b495a3e3991ebca378ef1f179aa3ccade7ec2a266b759a75e"} Mar 07 04:35:50 crc kubenswrapper[4689]: I0307 04:35:50.949725 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-2rfwv"] Mar 07 04:35:51 crc kubenswrapper[4689]: I0307 04:35:51.464176 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-2rfwv" event={"ID":"4fbda293-a134-43ca-8f42-6bc32bae4b57","Type":"ContainerStarted","Data":"76360d4b6a07701e49c02324df14dc4883065c171f974cc9f7fc8ac700f86971"} Mar 07 04:35:51 crc kubenswrapper[4689]: I0307 04:35:51.471115 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rngx5" event={"ID":"92d8c131-7403-441a-8112-7dcd003edf5f","Type":"ContainerStarted","Data":"8a70e1565776e9deffcd78d8172afed1516ead9cf6efd334ed6cc17dc1fd463b"} Mar 07 04:35:51 crc kubenswrapper[4689]: I0307 04:35:51.491439 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rngx5" podStartSLOduration=2.048624003 podStartE2EDuration="5.491418626s" podCreationTimestamp="2026-03-07 04:35:46 +0000 UTC" firstStartedPulling="2026-03-07 04:35:47.416730332 +0000 UTC m=+992.463113871" lastFinishedPulling="2026-03-07 04:35:50.859525015 +0000 UTC m=+995.905908494" observedRunningTime="2026-03-07 04:35:51.486674698 
+0000 UTC m=+996.533058207" watchObservedRunningTime="2026-03-07 04:35:51.491418626 +0000 UTC m=+996.537802125" Mar 07 04:35:52 crc kubenswrapper[4689]: I0307 04:35:52.481383 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-2rfwv" event={"ID":"4fbda293-a134-43ca-8f42-6bc32bae4b57","Type":"ContainerStarted","Data":"3dcde4284c638c48320217bbd8e310ec45eb950f0a7a31f9ae330ea133cbf5fe"} Mar 07 04:35:52 crc kubenswrapper[4689]: I0307 04:35:52.506824 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-2rfwv" podStartSLOduration=1.682021938 podStartE2EDuration="2.506805361s" podCreationTimestamp="2026-03-07 04:35:50 +0000 UTC" firstStartedPulling="2026-03-07 04:35:50.972037394 +0000 UTC m=+996.018420883" lastFinishedPulling="2026-03-07 04:35:51.796820817 +0000 UTC m=+996.843204306" observedRunningTime="2026-03-07 04:35:52.502626689 +0000 UTC m=+997.549010188" watchObservedRunningTime="2026-03-07 04:35:52.506805361 +0000 UTC m=+997.553188860" Mar 07 04:35:54 crc kubenswrapper[4689]: I0307 04:35:54.667586 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Mar 07 04:35:54 crc kubenswrapper[4689]: I0307 04:35:54.907372 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-5pqgx"] Mar 07 04:35:54 crc kubenswrapper[4689]: I0307 04:35:54.908643 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-5pqgx" Mar 07 04:35:54 crc kubenswrapper[4689]: I0307 04:35:54.913367 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-2rp9s" Mar 07 04:35:54 crc kubenswrapper[4689]: I0307 04:35:54.918094 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-5pqgx"] Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.017322 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjpzf\" (UniqueName: \"kubernetes.io/projected/7eb6e990-66c3-471d-b9b7-8a82f5652638-kube-api-access-xjpzf\") pod \"swift-operator-index-5pqgx\" (UID: \"7eb6e990-66c3-471d-b9b7-8a82f5652638\") " pod="openstack-operators/swift-operator-index-5pqgx" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.118592 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjpzf\" (UniqueName: \"kubernetes.io/projected/7eb6e990-66c3-471d-b9b7-8a82f5652638-kube-api-access-xjpzf\") pod \"swift-operator-index-5pqgx\" (UID: \"7eb6e990-66c3-471d-b9b7-8a82f5652638\") " pod="openstack-operators/swift-operator-index-5pqgx" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.140361 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjpzf\" (UniqueName: \"kubernetes.io/projected/7eb6e990-66c3-471d-b9b7-8a82f5652638-kube-api-access-xjpzf\") pod \"swift-operator-index-5pqgx\" (UID: \"7eb6e990-66c3-471d-b9b7-8a82f5652638\") " pod="openstack-operators/swift-operator-index-5pqgx" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.234079 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-5pqgx" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.256856 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-sync-vxm85"] Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.258673 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-vxm85" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.284621 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.284639 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.284800 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.284862 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-jz6fk" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.291630 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-vxm85"] Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.341030 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-config-data\") pod \"keystone-db-sync-vxm85\" (UID: \"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1\") " pod="glance-kuttl-tests/keystone-db-sync-vxm85" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.341120 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlfqv\" (UniqueName: \"kubernetes.io/projected/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-kube-api-access-qlfqv\") pod 
\"keystone-db-sync-vxm85\" (UID: \"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1\") " pod="glance-kuttl-tests/keystone-db-sync-vxm85" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.442411 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-config-data\") pod \"keystone-db-sync-vxm85\" (UID: \"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1\") " pod="glance-kuttl-tests/keystone-db-sync-vxm85" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.442817 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlfqv\" (UniqueName: \"kubernetes.io/projected/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-kube-api-access-qlfqv\") pod \"keystone-db-sync-vxm85\" (UID: \"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1\") " pod="glance-kuttl-tests/keystone-db-sync-vxm85" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.449485 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-config-data\") pod \"keystone-db-sync-vxm85\" (UID: \"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1\") " pod="glance-kuttl-tests/keystone-db-sync-vxm85" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.464134 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlfqv\" (UniqueName: \"kubernetes.io/projected/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-kube-api-access-qlfqv\") pod \"keystone-db-sync-vxm85\" (UID: \"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1\") " pod="glance-kuttl-tests/keystone-db-sync-vxm85" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.618923 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-vxm85" Mar 07 04:35:55 crc kubenswrapper[4689]: I0307 04:35:55.739590 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-5pqgx"] Mar 07 04:35:56 crc kubenswrapper[4689]: W0307 04:35:56.038569 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6c1a0bd_8705_4cd3_9ae6_db1e3bf87bd1.slice/crio-fafcd01de1997e99cc80c5f5e330c2494174ca418f73de6150563f9650c22050 WatchSource:0}: Error finding container fafcd01de1997e99cc80c5f5e330c2494174ca418f73de6150563f9650c22050: Status 404 returned error can't find the container with id fafcd01de1997e99cc80c5f5e330c2494174ca418f73de6150563f9650c22050 Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.041169 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-vxm85"] Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.313890 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vsggs"] Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.316120 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.333773 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vsggs"] Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.438413 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.438499 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.457425 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbkn7\" (UniqueName: \"kubernetes.io/projected/f5158f39-08c1-467c-a67f-360dd799f42f-kube-api-access-bbkn7\") pod \"redhat-operators-vsggs\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.457492 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-catalog-content\") pod \"redhat-operators-vsggs\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.457726 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-utilities\") pod \"redhat-operators-vsggs\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.497104 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.517622 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-vxm85" event={"ID":"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1","Type":"ContainerStarted","Data":"fafcd01de1997e99cc80c5f5e330c2494174ca418f73de6150563f9650c22050"} Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.521482 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-5pqgx" event={"ID":"7eb6e990-66c3-471d-b9b7-8a82f5652638","Type":"ContainerStarted","Data":"f8476293da548218427748dd425ef2ab582da898c20856eb903f762a7b2613b4"} Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.559048 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbkn7\" (UniqueName: \"kubernetes.io/projected/f5158f39-08c1-467c-a67f-360dd799f42f-kube-api-access-bbkn7\") pod \"redhat-operators-vsggs\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.559122 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-catalog-content\") pod \"redhat-operators-vsggs\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.559239 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-utilities\") pod \"redhat-operators-vsggs\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.560051 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-catalog-content\") pod \"redhat-operators-vsggs\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.560062 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-utilities\") pod \"redhat-operators-vsggs\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.588111 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbkn7\" (UniqueName: \"kubernetes.io/projected/f5158f39-08c1-467c-a67f-360dd799f42f-kube-api-access-bbkn7\") pod \"redhat-operators-vsggs\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.608989 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:35:56 crc kubenswrapper[4689]: I0307 04:35:56.649228 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:35:57 crc kubenswrapper[4689]: I0307 04:35:57.235730 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vsggs"] Mar 07 04:35:57 crc kubenswrapper[4689]: I0307 04:35:57.531364 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsggs" event={"ID":"f5158f39-08c1-467c-a67f-360dd799f42f","Type":"ContainerDied","Data":"5735fbaca5ce5b323cf225095180ea5d8accb129a8ae346fb8f77f49cf4d8fd7"} Mar 07 04:35:57 crc kubenswrapper[4689]: I0307 04:35:57.532036 4689 generic.go:334] "Generic (PLEG): container finished" podID="f5158f39-08c1-467c-a67f-360dd799f42f" containerID="5735fbaca5ce5b323cf225095180ea5d8accb129a8ae346fb8f77f49cf4d8fd7" exitCode=0 Mar 07 04:35:57 crc kubenswrapper[4689]: I0307 04:35:57.532102 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsggs" event={"ID":"f5158f39-08c1-467c-a67f-360dd799f42f","Type":"ContainerStarted","Data":"8f7cf3daf054bdb9e4635fc3c39d8e15baf0a9f63e3be007cd71171eb034f0b5"} Mar 07 04:35:57 crc kubenswrapper[4689]: I0307 04:35:57.536848 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-5pqgx" event={"ID":"7eb6e990-66c3-471d-b9b7-8a82f5652638","Type":"ContainerStarted","Data":"b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6"} Mar 07 04:35:57 crc kubenswrapper[4689]: I0307 04:35:57.564297 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-5pqgx" podStartSLOduration=2.498836342 podStartE2EDuration="3.564262965s" podCreationTimestamp="2026-03-07 04:35:54 +0000 UTC" firstStartedPulling="2026-03-07 04:35:55.749129389 +0000 UTC m=+1000.795512878" lastFinishedPulling="2026-03-07 04:35:56.814556012 +0000 UTC m=+1001.860939501" observedRunningTime="2026-03-07 04:35:57.560906114 +0000 UTC 
m=+1002.607289613" watchObservedRunningTime="2026-03-07 04:35:57.564262965 +0000 UTC m=+1002.610646454" Mar 07 04:35:58 crc kubenswrapper[4689]: I0307 04:35:58.550309 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsggs" event={"ID":"f5158f39-08c1-467c-a67f-360dd799f42f","Type":"ContainerStarted","Data":"38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984"} Mar 07 04:35:59 crc kubenswrapper[4689]: I0307 04:35:59.189941 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:35:59 crc kubenswrapper[4689]: I0307 04:35:59.190001 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:35:59 crc kubenswrapper[4689]: I0307 04:35:59.559734 4689 generic.go:334] "Generic (PLEG): container finished" podID="f5158f39-08c1-467c-a67f-360dd799f42f" containerID="38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984" exitCode=0 Mar 07 04:35:59 crc kubenswrapper[4689]: I0307 04:35:59.559798 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsggs" event={"ID":"f5158f39-08c1-467c-a67f-360dd799f42f","Type":"ContainerDied","Data":"38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984"} Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.130821 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547636-h7t9g"] Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.132853 4689 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547636-h7t9g" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.136392 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.136461 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.136743 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.140899 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547636-h7t9g"] Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.227391 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl8jq\" (UniqueName: \"kubernetes.io/projected/cd546d8b-6b91-4fbe-91a0-b16532fc2759-kube-api-access-fl8jq\") pod \"auto-csr-approver-29547636-h7t9g\" (UID: \"cd546d8b-6b91-4fbe-91a0-b16532fc2759\") " pod="openshift-infra/auto-csr-approver-29547636-h7t9g" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.328946 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl8jq\" (UniqueName: \"kubernetes.io/projected/cd546d8b-6b91-4fbe-91a0-b16532fc2759-kube-api-access-fl8jq\") pod \"auto-csr-approver-29547636-h7t9g\" (UID: \"cd546d8b-6b91-4fbe-91a0-b16532fc2759\") " pod="openshift-infra/auto-csr-approver-29547636-h7t9g" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.357760 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl8jq\" (UniqueName: \"kubernetes.io/projected/cd546d8b-6b91-4fbe-91a0-b16532fc2759-kube-api-access-fl8jq\") pod \"auto-csr-approver-29547636-h7t9g\" (UID: 
\"cd546d8b-6b91-4fbe-91a0-b16532fc2759\") " pod="openshift-infra/auto-csr-approver-29547636-h7t9g" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.439533 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-2rfwv" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.439594 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-2rfwv" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.459511 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547636-h7t9g" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.503803 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rngx5"] Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.504245 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rngx5" podUID="92d8c131-7403-441a-8112-7dcd003edf5f" containerName="registry-server" containerID="cri-o://8a70e1565776e9deffcd78d8172afed1516ead9cf6efd334ed6cc17dc1fd463b" gracePeriod=2 Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.504411 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-2rfwv" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.603579 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-2rfwv" Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.897758 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9cpft"] Mar 07 04:36:00 crc kubenswrapper[4689]: I0307 04:36:00.897993 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9cpft" 
podUID="dd67439f-628c-46f1-a216-273ee2d1bb0f" containerName="registry-server" containerID="cri-o://5bdfce13dd5ba8d71fe8ada28a6f267e74144e024f6d21818178901bae282253" gracePeriod=2 Mar 07 04:36:01 crc kubenswrapper[4689]: I0307 04:36:01.586079 4689 generic.go:334] "Generic (PLEG): container finished" podID="92d8c131-7403-441a-8112-7dcd003edf5f" containerID="8a70e1565776e9deffcd78d8172afed1516ead9cf6efd334ed6cc17dc1fd463b" exitCode=0 Mar 07 04:36:01 crc kubenswrapper[4689]: I0307 04:36:01.586521 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rngx5" event={"ID":"92d8c131-7403-441a-8112-7dcd003edf5f","Type":"ContainerDied","Data":"8a70e1565776e9deffcd78d8172afed1516ead9cf6efd334ed6cc17dc1fd463b"} Mar 07 04:36:01 crc kubenswrapper[4689]: I0307 04:36:01.588784 4689 generic.go:334] "Generic (PLEG): container finished" podID="dd67439f-628c-46f1-a216-273ee2d1bb0f" containerID="5bdfce13dd5ba8d71fe8ada28a6f267e74144e024f6d21818178901bae282253" exitCode=0 Mar 07 04:36:01 crc kubenswrapper[4689]: I0307 04:36:01.588852 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cpft" event={"ID":"dd67439f-628c-46f1-a216-273ee2d1bb0f","Type":"ContainerDied","Data":"5bdfce13dd5ba8d71fe8ada28a6f267e74144e024f6d21818178901bae282253"} Mar 07 04:36:03 crc kubenswrapper[4689]: I0307 04:36:03.855345 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:36:03 crc kubenswrapper[4689]: I0307 04:36:03.981532 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-utilities\") pod \"92d8c131-7403-441a-8112-7dcd003edf5f\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " Mar 07 04:36:03 crc kubenswrapper[4689]: I0307 04:36:03.981843 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t464\" (UniqueName: \"kubernetes.io/projected/92d8c131-7403-441a-8112-7dcd003edf5f-kube-api-access-4t464\") pod \"92d8c131-7403-441a-8112-7dcd003edf5f\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " Mar 07 04:36:03 crc kubenswrapper[4689]: I0307 04:36:03.982027 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-catalog-content\") pod \"92d8c131-7403-441a-8112-7dcd003edf5f\" (UID: \"92d8c131-7403-441a-8112-7dcd003edf5f\") " Mar 07 04:36:03 crc kubenswrapper[4689]: I0307 04:36:03.982429 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-utilities" (OuterVolumeSpecName: "utilities") pod "92d8c131-7403-441a-8112-7dcd003edf5f" (UID: "92d8c131-7403-441a-8112-7dcd003edf5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:36:03 crc kubenswrapper[4689]: I0307 04:36:03.986029 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d8c131-7403-441a-8112-7dcd003edf5f-kube-api-access-4t464" (OuterVolumeSpecName: "kube-api-access-4t464") pod "92d8c131-7403-441a-8112-7dcd003edf5f" (UID: "92d8c131-7403-441a-8112-7dcd003edf5f"). InnerVolumeSpecName "kube-api-access-4t464". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:36:03 crc kubenswrapper[4689]: I0307 04:36:03.996772 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:03 crc kubenswrapper[4689]: I0307 04:36:03.996796 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t464\" (UniqueName: \"kubernetes.io/projected/92d8c131-7403-441a-8112-7dcd003edf5f-kube-api-access-4t464\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.002189 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.042204 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92d8c131-7403-441a-8112-7dcd003edf5f" (UID: "92d8c131-7403-441a-8112-7dcd003edf5f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.097760 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr8rk\" (UniqueName: \"kubernetes.io/projected/dd67439f-628c-46f1-a216-273ee2d1bb0f-kube-api-access-rr8rk\") pod \"dd67439f-628c-46f1-a216-273ee2d1bb0f\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.097894 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-catalog-content\") pod \"dd67439f-628c-46f1-a216-273ee2d1bb0f\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.098053 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-utilities\") pod \"dd67439f-628c-46f1-a216-273ee2d1bb0f\" (UID: \"dd67439f-628c-46f1-a216-273ee2d1bb0f\") " Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.098570 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d8c131-7403-441a-8112-7dcd003edf5f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.098945 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-utilities" (OuterVolumeSpecName: "utilities") pod "dd67439f-628c-46f1-a216-273ee2d1bb0f" (UID: "dd67439f-628c-46f1-a216-273ee2d1bb0f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.101018 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd67439f-628c-46f1-a216-273ee2d1bb0f-kube-api-access-rr8rk" (OuterVolumeSpecName: "kube-api-access-rr8rk") pod "dd67439f-628c-46f1-a216-273ee2d1bb0f" (UID: "dd67439f-628c-46f1-a216-273ee2d1bb0f"). InnerVolumeSpecName "kube-api-access-rr8rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.152801 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd67439f-628c-46f1-a216-273ee2d1bb0f" (UID: "dd67439f-628c-46f1-a216-273ee2d1bb0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.200251 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.200290 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr8rk\" (UniqueName: \"kubernetes.io/projected/dd67439f-628c-46f1-a216-273ee2d1bb0f-kube-api-access-rr8rk\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.200304 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd67439f-628c-46f1-a216-273ee2d1bb0f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.213602 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547636-h7t9g"] Mar 07 04:36:04 crc kubenswrapper[4689]: W0307 
04:36:04.217323 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd546d8b_6b91_4fbe_91a0_b16532fc2759.slice/crio-63936c42679dbc62022a249fadade6942938e6f884b39b43067a6a6f873486f9 WatchSource:0}: Error finding container 63936c42679dbc62022a249fadade6942938e6f884b39b43067a6a6f873486f9: Status 404 returned error can't find the container with id 63936c42679dbc62022a249fadade6942938e6f884b39b43067a6a6f873486f9 Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.609074 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rngx5" event={"ID":"92d8c131-7403-441a-8112-7dcd003edf5f","Type":"ContainerDied","Data":"8ef5f0d1ac05dc2311ccaa323b3369a441ad57c9726c15f334241c628231e871"} Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.609092 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rngx5" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.609408 4689 scope.go:117] "RemoveContainer" containerID="8a70e1565776e9deffcd78d8172afed1516ead9cf6efd334ed6cc17dc1fd463b" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.611553 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-vxm85" event={"ID":"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1","Type":"ContainerStarted","Data":"4ec70c5fbe89650f3d3a689ac9370df57fc44f31c6efc015c2f7f2d1d4648fbf"} Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.616498 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547636-h7t9g" event={"ID":"cd546d8b-6b91-4fbe-91a0-b16532fc2759","Type":"ContainerStarted","Data":"63936c42679dbc62022a249fadade6942938e6f884b39b43067a6a6f873486f9"} Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.621667 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-9cpft" event={"ID":"dd67439f-628c-46f1-a216-273ee2d1bb0f","Type":"ContainerDied","Data":"248a00c87aeb690d2ddf485360dea1a944c65aff22a21e1311800a4389aaf278"} Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.621692 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9cpft" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.626585 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsggs" event={"ID":"f5158f39-08c1-467c-a67f-360dd799f42f","Type":"ContainerStarted","Data":"f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761"} Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.631557 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-vxm85" podStartSLOduration=1.812761014 podStartE2EDuration="9.631537615s" podCreationTimestamp="2026-03-07 04:35:55 +0000 UTC" firstStartedPulling="2026-03-07 04:35:56.040298138 +0000 UTC m=+1001.086681627" lastFinishedPulling="2026-03-07 04:36:03.859074739 +0000 UTC m=+1008.905458228" observedRunningTime="2026-03-07 04:36:04.630049555 +0000 UTC m=+1009.676433044" watchObservedRunningTime="2026-03-07 04:36:04.631537615 +0000 UTC m=+1009.677921104" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.649670 4689 scope.go:117] "RemoveContainer" containerID="3cd7bb69a72a2c3b495a3e3991ebca378ef1f179aa3ccade7ec2a266b759a75e" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.656064 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vsggs" podStartSLOduration=2.329312581 podStartE2EDuration="8.655625994s" podCreationTimestamp="2026-03-07 04:35:56 +0000 UTC" firstStartedPulling="2026-03-07 04:35:57.532701145 +0000 UTC m=+1002.579084634" lastFinishedPulling="2026-03-07 04:36:03.859014548 +0000 UTC m=+1008.905398047" 
observedRunningTime="2026-03-07 04:36:04.647549817 +0000 UTC m=+1009.693933306" watchObservedRunningTime="2026-03-07 04:36:04.655625994 +0000 UTC m=+1009.702009483" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.669792 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9cpft"] Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.678100 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9cpft"] Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.680426 4689 scope.go:117] "RemoveContainer" containerID="df3b93ae3552506ea438d14336fc029052e4d8bdd7116a7e6a768e5406f93965" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.691247 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rngx5"] Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.708978 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rngx5"] Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.710492 4689 scope.go:117] "RemoveContainer" containerID="5bdfce13dd5ba8d71fe8ada28a6f267e74144e024f6d21818178901bae282253" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.738046 4689 scope.go:117] "RemoveContainer" containerID="f674463bea7de68a169573ff355b679ceddd54cb82aebd0e1eea58467bdc64b9" Mar 07 04:36:04 crc kubenswrapper[4689]: I0307 04:36:04.775198 4689 scope.go:117] "RemoveContainer" containerID="adee63e3bb91bb5e073db34e85558be841665c13527e1368356ac5f438ed483b" Mar 07 04:36:05 crc kubenswrapper[4689]: I0307 04:36:05.234874 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-5pqgx" Mar 07 04:36:05 crc kubenswrapper[4689]: I0307 04:36:05.236507 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-5pqgx" Mar 07 04:36:05 crc kubenswrapper[4689]: I0307 
04:36:05.289081 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-5pqgx" Mar 07 04:36:05 crc kubenswrapper[4689]: I0307 04:36:05.638480 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547636-h7t9g" event={"ID":"cd546d8b-6b91-4fbe-91a0-b16532fc2759","Type":"ContainerStarted","Data":"3f4f0fed307167477eefb030f2243db52d2ea03369ae3e197f195f92174acb2f"} Mar 07 04:36:05 crc kubenswrapper[4689]: I0307 04:36:05.661733 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547636-h7t9g" podStartSLOduration=4.590346377 podStartE2EDuration="5.661711829s" podCreationTimestamp="2026-03-07 04:36:00 +0000 UTC" firstStartedPulling="2026-03-07 04:36:04.21975161 +0000 UTC m=+1009.266135099" lastFinishedPulling="2026-03-07 04:36:05.291117052 +0000 UTC m=+1010.337500551" observedRunningTime="2026-03-07 04:36:05.653913119 +0000 UTC m=+1010.700296618" watchObservedRunningTime="2026-03-07 04:36:05.661711829 +0000 UTC m=+1010.708095328" Mar 07 04:36:05 crc kubenswrapper[4689]: I0307 04:36:05.686732 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-5pqgx" Mar 07 04:36:05 crc kubenswrapper[4689]: I0307 04:36:05.836608 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d8c131-7403-441a-8112-7dcd003edf5f" path="/var/lib/kubelet/pods/92d8c131-7403-441a-8112-7dcd003edf5f/volumes" Mar 07 04:36:05 crc kubenswrapper[4689]: I0307 04:36:05.838266 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd67439f-628c-46f1-a216-273ee2d1bb0f" path="/var/lib/kubelet/pods/dd67439f-628c-46f1-a216-273ee2d1bb0f/volumes" Mar 07 04:36:06 crc kubenswrapper[4689]: I0307 04:36:06.647674 4689 generic.go:334] "Generic (PLEG): container finished" podID="cd546d8b-6b91-4fbe-91a0-b16532fc2759" 
containerID="3f4f0fed307167477eefb030f2243db52d2ea03369ae3e197f195f92174acb2f" exitCode=0 Mar 07 04:36:06 crc kubenswrapper[4689]: I0307 04:36:06.647710 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547636-h7t9g" event={"ID":"cd546d8b-6b91-4fbe-91a0-b16532fc2759","Type":"ContainerDied","Data":"3f4f0fed307167477eefb030f2243db52d2ea03369ae3e197f195f92174acb2f"} Mar 07 04:36:06 crc kubenswrapper[4689]: I0307 04:36:06.650550 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:36:06 crc kubenswrapper[4689]: I0307 04:36:06.650590 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:36:07 crc kubenswrapper[4689]: I0307 04:36:07.655897 4689 generic.go:334] "Generic (PLEG): container finished" podID="f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1" containerID="4ec70c5fbe89650f3d3a689ac9370df57fc44f31c6efc015c2f7f2d1d4648fbf" exitCode=0 Mar 07 04:36:07 crc kubenswrapper[4689]: I0307 04:36:07.655972 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-vxm85" event={"ID":"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1","Type":"ContainerDied","Data":"4ec70c5fbe89650f3d3a689ac9370df57fc44f31c6efc015c2f7f2d1d4648fbf"} Mar 07 04:36:07 crc kubenswrapper[4689]: I0307 04:36:07.700728 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vsggs" podUID="f5158f39-08c1-467c-a67f-360dd799f42f" containerName="registry-server" probeResult="failure" output=< Mar 07 04:36:07 crc kubenswrapper[4689]: timeout: failed to connect service ":50051" within 1s Mar 07 04:36:07 crc kubenswrapper[4689]: > Mar 07 04:36:07 crc kubenswrapper[4689]: I0307 04:36:07.990428 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547636-h7t9g" Mar 07 04:36:08 crc kubenswrapper[4689]: I0307 04:36:08.157516 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl8jq\" (UniqueName: \"kubernetes.io/projected/cd546d8b-6b91-4fbe-91a0-b16532fc2759-kube-api-access-fl8jq\") pod \"cd546d8b-6b91-4fbe-91a0-b16532fc2759\" (UID: \"cd546d8b-6b91-4fbe-91a0-b16532fc2759\") " Mar 07 04:36:08 crc kubenswrapper[4689]: I0307 04:36:08.164374 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd546d8b-6b91-4fbe-91a0-b16532fc2759-kube-api-access-fl8jq" (OuterVolumeSpecName: "kube-api-access-fl8jq") pod "cd546d8b-6b91-4fbe-91a0-b16532fc2759" (UID: "cd546d8b-6b91-4fbe-91a0-b16532fc2759"). InnerVolumeSpecName "kube-api-access-fl8jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:36:08 crc kubenswrapper[4689]: I0307 04:36:08.262520 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl8jq\" (UniqueName: \"kubernetes.io/projected/cd546d8b-6b91-4fbe-91a0-b16532fc2759-kube-api-access-fl8jq\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:08 crc kubenswrapper[4689]: I0307 04:36:08.669779 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547636-h7t9g" Mar 07 04:36:08 crc kubenswrapper[4689]: I0307 04:36:08.669772 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547636-h7t9g" event={"ID":"cd546d8b-6b91-4fbe-91a0-b16532fc2759","Type":"ContainerDied","Data":"63936c42679dbc62022a249fadade6942938e6f884b39b43067a6a6f873486f9"} Mar 07 04:36:08 crc kubenswrapper[4689]: I0307 04:36:08.669836 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63936c42679dbc62022a249fadade6942938e6f884b39b43067a6a6f873486f9" Mar 07 04:36:08 crc kubenswrapper[4689]: I0307 04:36:08.739677 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547630-wl6lv"] Mar 07 04:36:08 crc kubenswrapper[4689]: I0307 04:36:08.745632 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547630-wl6lv"] Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.023225 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-vxm85" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.183022 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlfqv\" (UniqueName: \"kubernetes.io/projected/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-kube-api-access-qlfqv\") pod \"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1\" (UID: \"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1\") " Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.183079 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-config-data\") pod \"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1\" (UID: \"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1\") " Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.199700 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-kube-api-access-qlfqv" (OuterVolumeSpecName: "kube-api-access-qlfqv") pod "f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1" (UID: "f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1"). InnerVolumeSpecName "kube-api-access-qlfqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.218936 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-config-data" (OuterVolumeSpecName: "config-data") pod "f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1" (UID: "f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.284631 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlfqv\" (UniqueName: \"kubernetes.io/projected/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-kube-api-access-qlfqv\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.284664 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.678086 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-vxm85" event={"ID":"f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1","Type":"ContainerDied","Data":"fafcd01de1997e99cc80c5f5e330c2494174ca418f73de6150563f9650c22050"} Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.678434 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fafcd01de1997e99cc80c5f5e330c2494174ca418f73de6150563f9650c22050" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.678147 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-vxm85" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.839639 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc541d1-e031-4e42-a304-66a08cb905b1" path="/var/lib/kubelet/pods/2dc541d1-e031-4e42-a304-66a08cb905b1/volumes" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.885295 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-s5fgw"] Mar 07 04:36:09 crc kubenswrapper[4689]: E0307 04:36:09.885747 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd546d8b-6b91-4fbe-91a0-b16532fc2759" containerName="oc" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.885784 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd546d8b-6b91-4fbe-91a0-b16532fc2759" containerName="oc" Mar 07 04:36:09 crc kubenswrapper[4689]: E0307 04:36:09.885810 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d8c131-7403-441a-8112-7dcd003edf5f" containerName="registry-server" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.885827 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d8c131-7403-441a-8112-7dcd003edf5f" containerName="registry-server" Mar 07 04:36:09 crc kubenswrapper[4689]: E0307 04:36:09.885855 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd67439f-628c-46f1-a216-273ee2d1bb0f" containerName="extract-content" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.885871 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd67439f-628c-46f1-a216-273ee2d1bb0f" containerName="extract-content" Mar 07 04:36:09 crc kubenswrapper[4689]: E0307 04:36:09.885894 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d8c131-7403-441a-8112-7dcd003edf5f" containerName="extract-content" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.885912 4689 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="92d8c131-7403-441a-8112-7dcd003edf5f" containerName="extract-content" Mar 07 04:36:09 crc kubenswrapper[4689]: E0307 04:36:09.885929 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1" containerName="keystone-db-sync" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.886666 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1" containerName="keystone-db-sync" Mar 07 04:36:09 crc kubenswrapper[4689]: E0307 04:36:09.886709 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d8c131-7403-441a-8112-7dcd003edf5f" containerName="extract-utilities" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.886727 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d8c131-7403-441a-8112-7dcd003edf5f" containerName="extract-utilities" Mar 07 04:36:09 crc kubenswrapper[4689]: E0307 04:36:09.886761 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd67439f-628c-46f1-a216-273ee2d1bb0f" containerName="registry-server" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.886776 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd67439f-628c-46f1-a216-273ee2d1bb0f" containerName="registry-server" Mar 07 04:36:09 crc kubenswrapper[4689]: E0307 04:36:09.886796 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd67439f-628c-46f1-a216-273ee2d1bb0f" containerName="extract-utilities" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.886813 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd67439f-628c-46f1-a216-273ee2d1bb0f" containerName="extract-utilities" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.887086 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd67439f-628c-46f1-a216-273ee2d1bb0f" containerName="registry-server" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.887130 4689 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="cd546d8b-6b91-4fbe-91a0-b16532fc2759" containerName="oc" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.887161 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d8c131-7403-441a-8112-7dcd003edf5f" containerName="registry-server" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.887222 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1" containerName="keystone-db-sync" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.888146 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.891056 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.891395 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-jz6fk" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.891501 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.891662 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.891939 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.898699 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-s5fgw"] Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.992993 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b25km\" (UniqueName: \"kubernetes.io/projected/2c1087ab-41b1-4140-87b2-96191dfe1928-kube-api-access-b25km\") pod 
\"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.993048 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-fernet-keys\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.993069 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-config-data\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.993317 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-scripts\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:09 crc kubenswrapper[4689]: I0307 04:36:09.993372 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-credential-keys\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.094983 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25km\" (UniqueName: \"kubernetes.io/projected/2c1087ab-41b1-4140-87b2-96191dfe1928-kube-api-access-b25km\") 
pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.095052 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-fernet-keys\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.095078 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-config-data\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.095146 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-scripts\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.095193 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-credential-keys\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.099600 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-scripts\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " 
pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.100464 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-fernet-keys\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.101643 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-credential-keys\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.102211 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-config-data\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.114097 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b25km\" (UniqueName: \"kubernetes.io/projected/2c1087ab-41b1-4140-87b2-96191dfe1928-kube-api-access-b25km\") pod \"keystone-bootstrap-s5fgw\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.213381 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.481618 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-s5fgw"] Mar 07 04:36:10 crc kubenswrapper[4689]: I0307 04:36:10.686236 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" event={"ID":"2c1087ab-41b1-4140-87b2-96191dfe1928","Type":"ContainerStarted","Data":"e6f03db3ec2c36c0e4e9f926e5e618a2ab2e5f4accae8bdfca99f690683d1001"} Mar 07 04:36:11 crc kubenswrapper[4689]: I0307 04:36:11.703726 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" event={"ID":"2c1087ab-41b1-4140-87b2-96191dfe1928","Type":"ContainerStarted","Data":"91a644c2b79c56d56175895a1ae32eaf0c816e2a1de75c928b161c4dd0001e47"} Mar 07 04:36:11 crc kubenswrapper[4689]: I0307 04:36:11.729366 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" podStartSLOduration=2.729344098 podStartE2EDuration="2.729344098s" podCreationTimestamp="2026-03-07 04:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:36:11.726056789 +0000 UTC m=+1016.772440298" watchObservedRunningTime="2026-03-07 04:36:11.729344098 +0000 UTC m=+1016.775727597" Mar 07 04:36:11 crc kubenswrapper[4689]: I0307 04:36:11.756522 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk"] Mar 07 04:36:11 crc kubenswrapper[4689]: I0307 04:36:11.757793 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:11 crc kubenswrapper[4689]: I0307 04:36:11.759900 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4j8gt" Mar 07 04:36:11 crc kubenswrapper[4689]: I0307 04:36:11.773666 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk"] Mar 07 04:36:11 crc kubenswrapper[4689]: I0307 04:36:11.933717 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-util\") pod \"a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk\" (UID: \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:11 crc kubenswrapper[4689]: I0307 04:36:11.933782 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-bundle\") pod \"a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk\" (UID: \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:11 crc kubenswrapper[4689]: I0307 04:36:11.933811 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx64d\" (UniqueName: \"kubernetes.io/projected/baf61d7b-9301-4e93-ba1f-60d19c9497d2-kube-api-access-gx64d\") pod \"a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk\" (UID: \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 
04:36:12.035561 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-util\") pod \"a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk\" (UID: \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.035631 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-bundle\") pod \"a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk\" (UID: \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.035664 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx64d\" (UniqueName: \"kubernetes.io/projected/baf61d7b-9301-4e93-ba1f-60d19c9497d2-kube-api-access-gx64d\") pod \"a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk\" (UID: \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.036368 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-util\") pod \"a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk\" (UID: \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.036550 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-bundle\") pod \"a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk\" (UID: \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.070156 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx64d\" (UniqueName: \"kubernetes.io/projected/baf61d7b-9301-4e93-ba1f-60d19c9497d2-kube-api-access-gx64d\") pod \"a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk\" (UID: \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.112533 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.593782 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk"] Mar 07 04:36:12 crc kubenswrapper[4689]: W0307 04:36:12.602676 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaf61d7b_9301_4e93_ba1f_60d19c9497d2.slice/crio-b7614c8176193f77b39168c850a206f0d2f59b9d14e9ccd5cf99d4d90d10df64 WatchSource:0}: Error finding container b7614c8176193f77b39168c850a206f0d2f59b9d14e9ccd5cf99d4d90d10df64: Status 404 returned error can't find the container with id b7614c8176193f77b39168c850a206f0d2f59b9d14e9ccd5cf99d4d90d10df64 Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.715474 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" 
event={"ID":"baf61d7b-9301-4e93-ba1f-60d19c9497d2","Type":"ContainerStarted","Data":"b7614c8176193f77b39168c850a206f0d2f59b9d14e9ccd5cf99d4d90d10df64"} Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.798207 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft"] Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.799579 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.821983 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft"] Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.954305 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-util\") pod \"9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft\" (UID: \"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.954372 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-bundle\") pod \"9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft\" (UID: \"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:12 crc kubenswrapper[4689]: I0307 04:36:12.954402 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67t5s\" (UniqueName: 
\"kubernetes.io/projected/69a1d244-2c54-45fa-af37-993bb70ec9ed-kube-api-access-67t5s\") pod \"9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft\" (UID: \"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:13 crc kubenswrapper[4689]: I0307 04:36:13.055782 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67t5s\" (UniqueName: \"kubernetes.io/projected/69a1d244-2c54-45fa-af37-993bb70ec9ed-kube-api-access-67t5s\") pod \"9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft\" (UID: \"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:13 crc kubenswrapper[4689]: I0307 04:36:13.055925 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-util\") pod \"9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft\" (UID: \"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:13 crc kubenswrapper[4689]: I0307 04:36:13.056450 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-util\") pod \"9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft\" (UID: \"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:13 crc kubenswrapper[4689]: I0307 04:36:13.056528 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-bundle\") pod \"9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft\" (UID: 
\"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:13 crc kubenswrapper[4689]: I0307 04:36:13.056816 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-bundle\") pod \"9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft\" (UID: \"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:13 crc kubenswrapper[4689]: I0307 04:36:13.073681 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67t5s\" (UniqueName: \"kubernetes.io/projected/69a1d244-2c54-45fa-af37-993bb70ec9ed-kube-api-access-67t5s\") pod \"9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft\" (UID: \"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:13 crc kubenswrapper[4689]: I0307 04:36:13.119789 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:13 crc kubenswrapper[4689]: I0307 04:36:13.591897 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft"] Mar 07 04:36:13 crc kubenswrapper[4689]: W0307 04:36:13.595273 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a1d244_2c54_45fa_af37_993bb70ec9ed.slice/crio-fbad84193ba965b5ecaa1f1a4784309632ef3322b33faef7ae5d58eeff8692a2 WatchSource:0}: Error finding container fbad84193ba965b5ecaa1f1a4784309632ef3322b33faef7ae5d58eeff8692a2: Status 404 returned error can't find the container with id fbad84193ba965b5ecaa1f1a4784309632ef3322b33faef7ae5d58eeff8692a2 Mar 07 04:36:13 crc kubenswrapper[4689]: I0307 04:36:13.725341 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" event={"ID":"69a1d244-2c54-45fa-af37-993bb70ec9ed","Type":"ContainerStarted","Data":"fbad84193ba965b5ecaa1f1a4784309632ef3322b33faef7ae5d58eeff8692a2"} Mar 07 04:36:13 crc kubenswrapper[4689]: I0307 04:36:13.731635 4689 generic.go:334] "Generic (PLEG): container finished" podID="baf61d7b-9301-4e93-ba1f-60d19c9497d2" containerID="3a0e105cfb4023323118eb9bfa2b2fa4d249bd155f30966d7e3dd29952a2184e" exitCode=0 Mar 07 04:36:13 crc kubenswrapper[4689]: I0307 04:36:13.731716 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" event={"ID":"baf61d7b-9301-4e93-ba1f-60d19c9497d2","Type":"ContainerDied","Data":"3a0e105cfb4023323118eb9bfa2b2fa4d249bd155f30966d7e3dd29952a2184e"} Mar 07 04:36:14 crc kubenswrapper[4689]: I0307 04:36:14.750372 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="69a1d244-2c54-45fa-af37-993bb70ec9ed" containerID="1553062b35e6fce92d9ceca1f370c71e00fe127c25f80f1a4887b6403e3a193c" exitCode=0 Mar 07 04:36:14 crc kubenswrapper[4689]: I0307 04:36:14.750431 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" event={"ID":"69a1d244-2c54-45fa-af37-993bb70ec9ed","Type":"ContainerDied","Data":"1553062b35e6fce92d9ceca1f370c71e00fe127c25f80f1a4887b6403e3a193c"} Mar 07 04:36:14 crc kubenswrapper[4689]: I0307 04:36:14.753761 4689 generic.go:334] "Generic (PLEG): container finished" podID="2c1087ab-41b1-4140-87b2-96191dfe1928" containerID="91a644c2b79c56d56175895a1ae32eaf0c816e2a1de75c928b161c4dd0001e47" exitCode=0 Mar 07 04:36:14 crc kubenswrapper[4689]: I0307 04:36:14.753848 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" event={"ID":"2c1087ab-41b1-4140-87b2-96191dfe1928","Type":"ContainerDied","Data":"91a644c2b79c56d56175895a1ae32eaf0c816e2a1de75c928b161c4dd0001e47"} Mar 07 04:36:15 crc kubenswrapper[4689]: I0307 04:36:15.762456 4689 generic.go:334] "Generic (PLEG): container finished" podID="baf61d7b-9301-4e93-ba1f-60d19c9497d2" containerID="b320209abcec8a7bf7c652ca0acfb98e46fd1e887390d0cb42ec7295c74fb951" exitCode=0 Mar 07 04:36:15 crc kubenswrapper[4689]: I0307 04:36:15.762533 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" event={"ID":"baf61d7b-9301-4e93-ba1f-60d19c9497d2","Type":"ContainerDied","Data":"b320209abcec8a7bf7c652ca0acfb98e46fd1e887390d0cb42ec7295c74fb951"} Mar 07 04:36:15 crc kubenswrapper[4689]: I0307 04:36:15.764560 4689 generic.go:334] "Generic (PLEG): container finished" podID="69a1d244-2c54-45fa-af37-993bb70ec9ed" containerID="448ee8c5bed33702fd3fa184542ce564afd4bf7c755e19d80750618f6cec259e" exitCode=0 Mar 07 04:36:15 crc kubenswrapper[4689]: I0307 04:36:15.764688 
4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" event={"ID":"69a1d244-2c54-45fa-af37-993bb70ec9ed","Type":"ContainerDied","Data":"448ee8c5bed33702fd3fa184542ce564afd4bf7c755e19d80750618f6cec259e"} Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.245589 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.303575 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-fernet-keys\") pod \"2c1087ab-41b1-4140-87b2-96191dfe1928\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.303648 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b25km\" (UniqueName: \"kubernetes.io/projected/2c1087ab-41b1-4140-87b2-96191dfe1928-kube-api-access-b25km\") pod \"2c1087ab-41b1-4140-87b2-96191dfe1928\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.303740 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-credential-keys\") pod \"2c1087ab-41b1-4140-87b2-96191dfe1928\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.303810 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-scripts\") pod \"2c1087ab-41b1-4140-87b2-96191dfe1928\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.303848 4689 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-config-data\") pod \"2c1087ab-41b1-4140-87b2-96191dfe1928\" (UID: \"2c1087ab-41b1-4140-87b2-96191dfe1928\") " Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.309466 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c1087ab-41b1-4140-87b2-96191dfe1928-kube-api-access-b25km" (OuterVolumeSpecName: "kube-api-access-b25km") pod "2c1087ab-41b1-4140-87b2-96191dfe1928" (UID: "2c1087ab-41b1-4140-87b2-96191dfe1928"). InnerVolumeSpecName "kube-api-access-b25km". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.309591 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-scripts" (OuterVolumeSpecName: "scripts") pod "2c1087ab-41b1-4140-87b2-96191dfe1928" (UID: "2c1087ab-41b1-4140-87b2-96191dfe1928"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.314246 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2c1087ab-41b1-4140-87b2-96191dfe1928" (UID: "2c1087ab-41b1-4140-87b2-96191dfe1928"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.331039 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2c1087ab-41b1-4140-87b2-96191dfe1928" (UID: "2c1087ab-41b1-4140-87b2-96191dfe1928"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.331189 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-config-data" (OuterVolumeSpecName: "config-data") pod "2c1087ab-41b1-4140-87b2-96191dfe1928" (UID: "2c1087ab-41b1-4140-87b2-96191dfe1928"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.405596 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.405637 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.405651 4689 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.405666 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b25km\" (UniqueName: \"kubernetes.io/projected/2c1087ab-41b1-4140-87b2-96191dfe1928-kube-api-access-b25km\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.405693 4689 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c1087ab-41b1-4140-87b2-96191dfe1928-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.694854 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 
04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.734738 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.774626 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" event={"ID":"2c1087ab-41b1-4140-87b2-96191dfe1928","Type":"ContainerDied","Data":"e6f03db3ec2c36c0e4e9f926e5e618a2ab2e5f4accae8bdfca99f690683d1001"} Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.774681 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6f03db3ec2c36c0e4e9f926e5e618a2ab2e5f4accae8bdfca99f690683d1001" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.774613 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-s5fgw" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.776584 4689 generic.go:334] "Generic (PLEG): container finished" podID="baf61d7b-9301-4e93-ba1f-60d19c9497d2" containerID="511d560565248d5269c105e78756e9566c822b23ba9157ec1482a0a87b287780" exitCode=0 Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.776646 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" event={"ID":"baf61d7b-9301-4e93-ba1f-60d19c9497d2","Type":"ContainerDied","Data":"511d560565248d5269c105e78756e9566c822b23ba9157ec1482a0a87b287780"} Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.778699 4689 generic.go:334] "Generic (PLEG): container finished" podID="69a1d244-2c54-45fa-af37-993bb70ec9ed" containerID="beedd200283e3ddcf5f321d56d2129a80e698420f1bfbff4e065fa4516a71d84" exitCode=0 Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.778791 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" 
event={"ID":"69a1d244-2c54-45fa-af37-993bb70ec9ed","Type":"ContainerDied","Data":"beedd200283e3ddcf5f321d56d2129a80e698420f1bfbff4e065fa4516a71d84"} Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.991157 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-69f7dd67f9-5tpdd"] Mar 07 04:36:16 crc kubenswrapper[4689]: E0307 04:36:16.991779 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c1087ab-41b1-4140-87b2-96191dfe1928" containerName="keystone-bootstrap" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.991803 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c1087ab-41b1-4140-87b2-96191dfe1928" containerName="keystone-bootstrap" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.991962 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c1087ab-41b1-4140-87b2-96191dfe1928" containerName="keystone-bootstrap" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.992502 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.996427 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-jz6fk" Mar 07 04:36:16 crc kubenswrapper[4689]: I0307 04:36:16.996680 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.003274 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-69f7dd67f9-5tpdd"] Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.006379 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.006589 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.115644 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-config-data\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.115720 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwrsm\" (UniqueName: \"kubernetes.io/projected/01e7640b-0391-468f-b8d7-8d0078e52e5f-kube-api-access-qwrsm\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.115825 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-fernet-keys\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.115953 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-credential-keys\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.116085 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-scripts\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.217422 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwrsm\" (UniqueName: \"kubernetes.io/projected/01e7640b-0391-468f-b8d7-8d0078e52e5f-kube-api-access-qwrsm\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.217502 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-fernet-keys\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.217528 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-credential-keys\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.217570 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-scripts\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.217619 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-config-data\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.222636 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-scripts\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.223484 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-config-data\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.224802 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-fernet-keys\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: 
\"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.232220 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-credential-keys\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.243786 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwrsm\" (UniqueName: \"kubernetes.io/projected/01e7640b-0391-468f-b8d7-8d0078e52e5f-kube-api-access-qwrsm\") pod \"keystone-69f7dd67f9-5tpdd\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.355797 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.753242 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-69f7dd67f9-5tpdd"] Mar 07 04:36:17 crc kubenswrapper[4689]: W0307 04:36:17.756131 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01e7640b_0391_468f_b8d7_8d0078e52e5f.slice/crio-892cbbc83d0b091f4ac373a3a284acaa066b14d3f1dda0b69050cfd1ed6a34e5 WatchSource:0}: Error finding container 892cbbc83d0b091f4ac373a3a284acaa066b14d3f1dda0b69050cfd1ed6a34e5: Status 404 returned error can't find the container with id 892cbbc83d0b091f4ac373a3a284acaa066b14d3f1dda0b69050cfd1ed6a34e5 Mar 07 04:36:17 crc kubenswrapper[4689]: I0307 04:36:17.786896 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" 
event={"ID":"01e7640b-0391-468f-b8d7-8d0078e52e5f","Type":"ContainerStarted","Data":"892cbbc83d0b091f4ac373a3a284acaa066b14d3f1dda0b69050cfd1ed6a34e5"} Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.121012 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.137302 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.233554 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67t5s\" (UniqueName: \"kubernetes.io/projected/69a1d244-2c54-45fa-af37-993bb70ec9ed-kube-api-access-67t5s\") pod \"69a1d244-2c54-45fa-af37-993bb70ec9ed\" (UID: \"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.233619 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-util\") pod \"69a1d244-2c54-45fa-af37-993bb70ec9ed\" (UID: \"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.233679 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-bundle\") pod \"69a1d244-2c54-45fa-af37-993bb70ec9ed\" (UID: \"69a1d244-2c54-45fa-af37-993bb70ec9ed\") " Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.233731 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx64d\" (UniqueName: \"kubernetes.io/projected/baf61d7b-9301-4e93-ba1f-60d19c9497d2-kube-api-access-gx64d\") pod \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\" (UID: 
\"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.233762 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-util\") pod \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\" (UID: \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.233776 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-bundle\") pod \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\" (UID: \"baf61d7b-9301-4e93-ba1f-60d19c9497d2\") " Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.234545 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-bundle" (OuterVolumeSpecName: "bundle") pod "baf61d7b-9301-4e93-ba1f-60d19c9497d2" (UID: "baf61d7b-9301-4e93-ba1f-60d19c9497d2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.237581 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf61d7b-9301-4e93-ba1f-60d19c9497d2-kube-api-access-gx64d" (OuterVolumeSpecName: "kube-api-access-gx64d") pod "baf61d7b-9301-4e93-ba1f-60d19c9497d2" (UID: "baf61d7b-9301-4e93-ba1f-60d19c9497d2"). InnerVolumeSpecName "kube-api-access-gx64d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.240194 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a1d244-2c54-45fa-af37-993bb70ec9ed-kube-api-access-67t5s" (OuterVolumeSpecName: "kube-api-access-67t5s") pod "69a1d244-2c54-45fa-af37-993bb70ec9ed" (UID: "69a1d244-2c54-45fa-af37-993bb70ec9ed"). 
InnerVolumeSpecName "kube-api-access-67t5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.249518 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-bundle" (OuterVolumeSpecName: "bundle") pod "69a1d244-2c54-45fa-af37-993bb70ec9ed" (UID: "69a1d244-2c54-45fa-af37-993bb70ec9ed"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.258842 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-util" (OuterVolumeSpecName: "util") pod "baf61d7b-9301-4e93-ba1f-60d19c9497d2" (UID: "baf61d7b-9301-4e93-ba1f-60d19c9497d2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.260118 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-util" (OuterVolumeSpecName: "util") pod "69a1d244-2c54-45fa-af37-993bb70ec9ed" (UID: "69a1d244-2c54-45fa-af37-993bb70ec9ed"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.335545 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx64d\" (UniqueName: \"kubernetes.io/projected/baf61d7b-9301-4e93-ba1f-60d19c9497d2-kube-api-access-gx64d\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.335579 4689 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-util\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.335617 4689 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/baf61d7b-9301-4e93-ba1f-60d19c9497d2-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.335625 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67t5s\" (UniqueName: \"kubernetes.io/projected/69a1d244-2c54-45fa-af37-993bb70ec9ed-kube-api-access-67t5s\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.335634 4689 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-util\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.335644 4689 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69a1d244-2c54-45fa-af37-993bb70ec9ed-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.797054 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" event={"ID":"01e7640b-0391-468f-b8d7-8d0078e52e5f","Type":"ContainerStarted","Data":"87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86"} Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 
04:36:18.802391 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" event={"ID":"baf61d7b-9301-4e93-ba1f-60d19c9497d2","Type":"ContainerDied","Data":"b7614c8176193f77b39168c850a206f0d2f59b9d14e9ccd5cf99d4d90d10df64"} Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.802445 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7614c8176193f77b39168c850a206f0d2f59b9d14e9ccd5cf99d4d90d10df64" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.802521 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.806890 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" event={"ID":"69a1d244-2c54-45fa-af37-993bb70ec9ed","Type":"ContainerDied","Data":"fbad84193ba965b5ecaa1f1a4784309632ef3322b33faef7ae5d58eeff8692a2"} Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.806930 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbad84193ba965b5ecaa1f1a4784309632ef3322b33faef7ae5d58eeff8692a2" Mar 07 04:36:18 crc kubenswrapper[4689]: I0307 04:36:18.806997 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft" Mar 07 04:36:19 crc kubenswrapper[4689]: I0307 04:36:19.161403 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" podStartSLOduration=3.161373368 podStartE2EDuration="3.161373368s" podCreationTimestamp="2026-03-07 04:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:36:18.826533763 +0000 UTC m=+1023.872917292" watchObservedRunningTime="2026-03-07 04:36:19.161373368 +0000 UTC m=+1024.207756877" Mar 07 04:36:19 crc kubenswrapper[4689]: I0307 04:36:19.814146 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.298147 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vsggs"] Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.299895 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vsggs" podUID="f5158f39-08c1-467c-a67f-360dd799f42f" containerName="registry-server" containerID="cri-o://f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761" gracePeriod=2 Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.714159 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.745468 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-utilities\") pod \"f5158f39-08c1-467c-a67f-360dd799f42f\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.745542 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbkn7\" (UniqueName: \"kubernetes.io/projected/f5158f39-08c1-467c-a67f-360dd799f42f-kube-api-access-bbkn7\") pod \"f5158f39-08c1-467c-a67f-360dd799f42f\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.745606 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-catalog-content\") pod \"f5158f39-08c1-467c-a67f-360dd799f42f\" (UID: \"f5158f39-08c1-467c-a67f-360dd799f42f\") " Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.746156 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-utilities" (OuterVolumeSpecName: "utilities") pod "f5158f39-08c1-467c-a67f-360dd799f42f" (UID: "f5158f39-08c1-467c-a67f-360dd799f42f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.752345 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5158f39-08c1-467c-a67f-360dd799f42f-kube-api-access-bbkn7" (OuterVolumeSpecName: "kube-api-access-bbkn7") pod "f5158f39-08c1-467c-a67f-360dd799f42f" (UID: "f5158f39-08c1-467c-a67f-360dd799f42f"). InnerVolumeSpecName "kube-api-access-bbkn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.848635 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.848678 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbkn7\" (UniqueName: \"kubernetes.io/projected/f5158f39-08c1-467c-a67f-360dd799f42f-kube-api-access-bbkn7\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.885610 4689 generic.go:334] "Generic (PLEG): container finished" podID="f5158f39-08c1-467c-a67f-360dd799f42f" containerID="f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761" exitCode=0 Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.885665 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsggs" event={"ID":"f5158f39-08c1-467c-a67f-360dd799f42f","Type":"ContainerDied","Data":"f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761"} Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.885702 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsggs" event={"ID":"f5158f39-08c1-467c-a67f-360dd799f42f","Type":"ContainerDied","Data":"8f7cf3daf054bdb9e4635fc3c39d8e15baf0a9f63e3be007cd71171eb034f0b5"} Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.885737 4689 scope.go:117] "RemoveContainer" containerID="f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.885926 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vsggs" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.938560 4689 scope.go:117] "RemoveContainer" containerID="38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.944510 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5158f39-08c1-467c-a67f-360dd799f42f" (UID: "f5158f39-08c1-467c-a67f-360dd799f42f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.950981 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5158f39-08c1-467c-a67f-360dd799f42f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.982753 4689 scope.go:117] "RemoveContainer" containerID="5735fbaca5ce5b323cf225095180ea5d8accb129a8ae346fb8f77f49cf4d8fd7" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.996790 4689 scope.go:117] "RemoveContainer" containerID="f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761" Mar 07 04:36:25 crc kubenswrapper[4689]: E0307 04:36:25.997586 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761\": container with ID starting with f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761 not found: ID does not exist" containerID="f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.997627 4689 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761"} err="failed to get container status \"f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761\": rpc error: code = NotFound desc = could not find container \"f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761\": container with ID starting with f3111cd0d7168d16fcf39b155a2a0fa04b758dc668fc7f5c860af2edf297b761 not found: ID does not exist" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.997659 4689 scope.go:117] "RemoveContainer" containerID="38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984" Mar 07 04:36:25 crc kubenswrapper[4689]: E0307 04:36:25.997940 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984\": container with ID starting with 38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984 not found: ID does not exist" containerID="38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.997959 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984"} err="failed to get container status \"38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984\": rpc error: code = NotFound desc = could not find container \"38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984\": container with ID starting with 38e0b1ac35ced597c9c13d0e04cb654892d5175c42b15257741c01742bbf2984 not found: ID does not exist" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.997971 4689 scope.go:117] "RemoveContainer" containerID="5735fbaca5ce5b323cf225095180ea5d8accb129a8ae346fb8f77f49cf4d8fd7" Mar 07 04:36:25 crc kubenswrapper[4689]: E0307 04:36:25.998320 4689 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5735fbaca5ce5b323cf225095180ea5d8accb129a8ae346fb8f77f49cf4d8fd7\": container with ID starting with 5735fbaca5ce5b323cf225095180ea5d8accb129a8ae346fb8f77f49cf4d8fd7 not found: ID does not exist" containerID="5735fbaca5ce5b323cf225095180ea5d8accb129a8ae346fb8f77f49cf4d8fd7" Mar 07 04:36:25 crc kubenswrapper[4689]: I0307 04:36:25.998359 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5735fbaca5ce5b323cf225095180ea5d8accb129a8ae346fb8f77f49cf4d8fd7"} err="failed to get container status \"5735fbaca5ce5b323cf225095180ea5d8accb129a8ae346fb8f77f49cf4d8fd7\": rpc error: code = NotFound desc = could not find container \"5735fbaca5ce5b323cf225095180ea5d8accb129a8ae346fb8f77f49cf4d8fd7\": container with ID starting with 5735fbaca5ce5b323cf225095180ea5d8accb129a8ae346fb8f77f49cf4d8fd7 not found: ID does not exist" Mar 07 04:36:26 crc kubenswrapper[4689]: I0307 04:36:26.235710 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vsggs"] Mar 07 04:36:26 crc kubenswrapper[4689]: I0307 04:36:26.253292 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vsggs"] Mar 07 04:36:27 crc kubenswrapper[4689]: I0307 04:36:27.834085 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5158f39-08c1-467c-a67f-360dd799f42f" path="/var/lib/kubelet/pods/f5158f39-08c1-467c-a67f-360dd799f42f/volumes" Mar 07 04:36:29 crc kubenswrapper[4689]: I0307 04:36:29.189334 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:36:29 crc kubenswrapper[4689]: I0307 04:36:29.189398 4689 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:36:29 crc kubenswrapper[4689]: I0307 04:36:29.189453 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:36:29 crc kubenswrapper[4689]: I0307 04:36:29.190116 4689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"095186d39ccb32197b5727728ef69f96ce62106ff83eff2af68654fa691615da"} pod="openshift-machine-config-operator/machine-config-daemon-dss5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 04:36:29 crc kubenswrapper[4689]: I0307 04:36:29.190200 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" containerID="cri-o://095186d39ccb32197b5727728ef69f96ce62106ff83eff2af68654fa691615da" gracePeriod=600 Mar 07 04:36:29 crc kubenswrapper[4689]: I0307 04:36:29.926441 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerID="095186d39ccb32197b5727728ef69f96ce62106ff83eff2af68654fa691615da" exitCode=0 Mar 07 04:36:29 crc kubenswrapper[4689]: I0307 04:36:29.926993 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerDied","Data":"095186d39ccb32197b5727728ef69f96ce62106ff83eff2af68654fa691615da"} Mar 07 04:36:29 crc kubenswrapper[4689]: I0307 04:36:29.927036 4689 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"ae730408636ba5641da1384c81b782848c445de37ccd29b97d13a35866436afe"} Mar 07 04:36:29 crc kubenswrapper[4689]: I0307 04:36:29.927055 4689 scope.go:117] "RemoveContainer" containerID="9c811faf449bec22216350a82fb0e4edb8efb6f32a1e999aafd915dabcad4588" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.198996 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq"] Mar 07 04:36:30 crc kubenswrapper[4689]: E0307 04:36:30.199262 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf61d7b-9301-4e93-ba1f-60d19c9497d2" containerName="pull" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199274 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf61d7b-9301-4e93-ba1f-60d19c9497d2" containerName="pull" Mar 07 04:36:30 crc kubenswrapper[4689]: E0307 04:36:30.199286 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5158f39-08c1-467c-a67f-360dd799f42f" containerName="extract-utilities" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199292 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5158f39-08c1-467c-a67f-360dd799f42f" containerName="extract-utilities" Mar 07 04:36:30 crc kubenswrapper[4689]: E0307 04:36:30.199303 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a1d244-2c54-45fa-af37-993bb70ec9ed" containerName="pull" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199310 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a1d244-2c54-45fa-af37-993bb70ec9ed" containerName="pull" Mar 07 04:36:30 crc kubenswrapper[4689]: E0307 04:36:30.199318 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf61d7b-9301-4e93-ba1f-60d19c9497d2" containerName="extract" Mar 07 
04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199323 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf61d7b-9301-4e93-ba1f-60d19c9497d2" containerName="extract" Mar 07 04:36:30 crc kubenswrapper[4689]: E0307 04:36:30.199331 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf61d7b-9301-4e93-ba1f-60d19c9497d2" containerName="util" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199336 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf61d7b-9301-4e93-ba1f-60d19c9497d2" containerName="util" Mar 07 04:36:30 crc kubenswrapper[4689]: E0307 04:36:30.199346 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a1d244-2c54-45fa-af37-993bb70ec9ed" containerName="util" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199351 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a1d244-2c54-45fa-af37-993bb70ec9ed" containerName="util" Mar 07 04:36:30 crc kubenswrapper[4689]: E0307 04:36:30.199358 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5158f39-08c1-467c-a67f-360dd799f42f" containerName="registry-server" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199363 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5158f39-08c1-467c-a67f-360dd799f42f" containerName="registry-server" Mar 07 04:36:30 crc kubenswrapper[4689]: E0307 04:36:30.199370 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a1d244-2c54-45fa-af37-993bb70ec9ed" containerName="extract" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199376 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a1d244-2c54-45fa-af37-993bb70ec9ed" containerName="extract" Mar 07 04:36:30 crc kubenswrapper[4689]: E0307 04:36:30.199388 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5158f39-08c1-467c-a67f-360dd799f42f" containerName="extract-content" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199394 4689 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f5158f39-08c1-467c-a67f-360dd799f42f" containerName="extract-content" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199490 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a1d244-2c54-45fa-af37-993bb70ec9ed" containerName="extract" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199501 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5158f39-08c1-467c-a67f-360dd799f42f" containerName="registry-server" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199512 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf61d7b-9301-4e93-ba1f-60d19c9497d2" containerName="extract" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.199939 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.204351 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-7sfgv" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.205142 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96b13de3-e5e2-456c-8b92-fb7adb492a65-apiservice-cert\") pod \"horizon-operator-controller-manager-5df65d59b6-8hmtq\" (UID: \"96b13de3-e5e2-456c-8b92-fb7adb492a65\") " pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.205336 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.205349 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/96b13de3-e5e2-456c-8b92-fb7adb492a65-webhook-cert\") pod \"horizon-operator-controller-manager-5df65d59b6-8hmtq\" (UID: \"96b13de3-e5e2-456c-8b92-fb7adb492a65\") " pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.205427 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv6lw\" (UniqueName: \"kubernetes.io/projected/96b13de3-e5e2-456c-8b92-fb7adb492a65-kube-api-access-dv6lw\") pod \"horizon-operator-controller-manager-5df65d59b6-8hmtq\" (UID: \"96b13de3-e5e2-456c-8b92-fb7adb492a65\") " pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.215198 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq"] Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.307112 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96b13de3-e5e2-456c-8b92-fb7adb492a65-apiservice-cert\") pod \"horizon-operator-controller-manager-5df65d59b6-8hmtq\" (UID: \"96b13de3-e5e2-456c-8b92-fb7adb492a65\") " pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.307208 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/96b13de3-e5e2-456c-8b92-fb7adb492a65-webhook-cert\") pod \"horizon-operator-controller-manager-5df65d59b6-8hmtq\" (UID: \"96b13de3-e5e2-456c-8b92-fb7adb492a65\") " pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.308090 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv6lw\" 
(UniqueName: \"kubernetes.io/projected/96b13de3-e5e2-456c-8b92-fb7adb492a65-kube-api-access-dv6lw\") pod \"horizon-operator-controller-manager-5df65d59b6-8hmtq\" (UID: \"96b13de3-e5e2-456c-8b92-fb7adb492a65\") " pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.312547 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/96b13de3-e5e2-456c-8b92-fb7adb492a65-webhook-cert\") pod \"horizon-operator-controller-manager-5df65d59b6-8hmtq\" (UID: \"96b13de3-e5e2-456c-8b92-fb7adb492a65\") " pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.312893 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96b13de3-e5e2-456c-8b92-fb7adb492a65-apiservice-cert\") pod \"horizon-operator-controller-manager-5df65d59b6-8hmtq\" (UID: \"96b13de3-e5e2-456c-8b92-fb7adb492a65\") " pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.328625 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv6lw\" (UniqueName: \"kubernetes.io/projected/96b13de3-e5e2-456c-8b92-fb7adb492a65-kube-api-access-dv6lw\") pod \"horizon-operator-controller-manager-5df65d59b6-8hmtq\" (UID: \"96b13de3-e5e2-456c-8b92-fb7adb492a65\") " pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.515397 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:30 crc kubenswrapper[4689]: I0307 04:36:30.965565 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq"] Mar 07 04:36:31 crc kubenswrapper[4689]: I0307 04:36:31.951136 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" event={"ID":"96b13de3-e5e2-456c-8b92-fb7adb492a65","Type":"ContainerStarted","Data":"347a68184e1ee6db9eb433f4e66a1ac3fa497d331be6cc65f5522e53972c9eb4"} Mar 07 04:36:34 crc kubenswrapper[4689]: I0307 04:36:34.978721 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" event={"ID":"96b13de3-e5e2-456c-8b92-fb7adb492a65","Type":"ContainerStarted","Data":"06a42fcf045bb4d804bfd2cc8d6f64f7d2524dd4a013aa51c94596af94c53298"} Mar 07 04:36:34 crc kubenswrapper[4689]: I0307 04:36:34.979390 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:35 crc kubenswrapper[4689]: I0307 04:36:35.004518 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" podStartSLOduration=1.421374384 podStartE2EDuration="5.004491406s" podCreationTimestamp="2026-03-07 04:36:30 +0000 UTC" firstStartedPulling="2026-03-07 04:36:30.987299658 +0000 UTC m=+1036.033683157" lastFinishedPulling="2026-03-07 04:36:34.57041668 +0000 UTC m=+1039.616800179" observedRunningTime="2026-03-07 04:36:34.998132745 +0000 UTC m=+1040.044516254" watchObservedRunningTime="2026-03-07 04:36:35.004491406 +0000 UTC m=+1040.050874935" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.630199 4689 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4"] Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.631471 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.633397 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8vqd6" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.635838 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.641292 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4"] Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.709969 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpt2n\" (UniqueName: \"kubernetes.io/projected/096a01ec-b76b-4553-aa1b-91b0282c3470-kube-api-access-fpt2n\") pod \"swift-operator-controller-manager-744fbfddcd-gk4h4\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.710113 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-apiservice-cert\") pod \"swift-operator-controller-manager-744fbfddcd-gk4h4\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.710307 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-webhook-cert\") pod \"swift-operator-controller-manager-744fbfddcd-gk4h4\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.812012 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-webhook-cert\") pod \"swift-operator-controller-manager-744fbfddcd-gk4h4\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.812186 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpt2n\" (UniqueName: \"kubernetes.io/projected/096a01ec-b76b-4553-aa1b-91b0282c3470-kube-api-access-fpt2n\") pod \"swift-operator-controller-manager-744fbfddcd-gk4h4\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.812251 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-apiservice-cert\") pod \"swift-operator-controller-manager-744fbfddcd-gk4h4\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.819644 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-webhook-cert\") pod \"swift-operator-controller-manager-744fbfddcd-gk4h4\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " 
pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.826779 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-apiservice-cert\") pod \"swift-operator-controller-manager-744fbfddcd-gk4h4\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.830762 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpt2n\" (UniqueName: \"kubernetes.io/projected/096a01ec-b76b-4553-aa1b-91b0282c3470-kube-api-access-fpt2n\") pod \"swift-operator-controller-manager-744fbfddcd-gk4h4\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:37 crc kubenswrapper[4689]: I0307 04:36:37.948838 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:38 crc kubenswrapper[4689]: I0307 04:36:38.423313 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4"] Mar 07 04:36:39 crc kubenswrapper[4689]: I0307 04:36:39.015191 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" event={"ID":"096a01ec-b76b-4553-aa1b-91b0282c3470","Type":"ContainerStarted","Data":"088a69d6c690f0e2371b123c9c6dd232095ac0343cfb407fbd14f4cb940cb2bf"} Mar 07 04:36:40 crc kubenswrapper[4689]: I0307 04:36:40.525992 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5df65d59b6-8hmtq" Mar 07 04:36:42 crc kubenswrapper[4689]: I0307 04:36:42.044649 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" event={"ID":"096a01ec-b76b-4553-aa1b-91b0282c3470","Type":"ContainerStarted","Data":"96c0d27aae27223b99c7a936596e5ce31f081a158d76a7845e6c32ffb2829466"} Mar 07 04:36:42 crc kubenswrapper[4689]: I0307 04:36:42.045147 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:42 crc kubenswrapper[4689]: I0307 04:36:42.075880 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" podStartSLOduration=2.053001927 podStartE2EDuration="5.075858906s" podCreationTimestamp="2026-03-07 04:36:37 +0000 UTC" firstStartedPulling="2026-03-07 04:36:38.431148236 +0000 UTC m=+1043.477531725" lastFinishedPulling="2026-03-07 04:36:41.454005195 +0000 UTC m=+1046.500388704" observedRunningTime="2026-03-07 04:36:42.068912539 +0000 UTC m=+1047.115296038" 
watchObservedRunningTime="2026-03-07 04:36:42.075858906 +0000 UTC m=+1047.122242395" Mar 07 04:36:47 crc kubenswrapper[4689]: I0307 04:36:47.952643 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:36:48 crc kubenswrapper[4689]: I0307 04:36:48.685454 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:36:50 crc kubenswrapper[4689]: I0307 04:36:50.806314 4689 scope.go:117] "RemoveContainer" containerID="7c0b064a3b0f3ef5d50efe29dd1b27c172544ea401153f3747007d9d139a8a3c" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.074762 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.107530 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.111067 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-9z4lw" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.112123 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.112184 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.112400 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.114465 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.198270 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjfbz\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-kube-api-access-mjfbz\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.198315 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.198365 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-lock\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.198391 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.198456 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-cache\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.299373 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjfbz\" (UniqueName: 
\"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-kube-api-access-mjfbz\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.299435 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.299469 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-lock\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.299493 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.299542 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-cache\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.300052 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-cache\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc 
kubenswrapper[4689]: I0307 04:36:51.300491 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: E0307 04:36:51.300707 4689 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 04:36:51 crc kubenswrapper[4689]: E0307 04:36:51.301135 4689 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 04:36:51 crc kubenswrapper[4689]: E0307 04:36:51.301239 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift podName:72bf7dd5-1e66-47a7-ae3f-477fcfb02742 nodeName:}" failed. No retries permitted until 2026-03-07 04:36:51.801213475 +0000 UTC m=+1056.847596964 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift") pod "swift-storage-0" (UID: "72bf7dd5-1e66-47a7-ae3f-477fcfb02742") : configmap "swift-ring-files" not found Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.300746 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-lock\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.334122 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjfbz\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-kube-api-access-mjfbz\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.334452 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.538922 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-ldm2v"] Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.539947 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.550557 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.551223 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.551367 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.557311 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-ldm2v"] Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.603585 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-swiftconf\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.603659 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56j59\" (UniqueName: \"kubernetes.io/projected/82016afa-d3dc-4bd8-ae60-db43a0960865-kube-api-access-56j59\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.603764 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/82016afa-d3dc-4bd8-ae60-db43a0960865-etc-swift\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " 
pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.603791 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-dispersionconf\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.603832 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-scripts\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.603904 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-ring-data-devices\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.705535 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/82016afa-d3dc-4bd8-ae60-db43a0960865-etc-swift\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.705598 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-dispersionconf\") pod \"swift-ring-rebalance-ldm2v\" (UID: 
\"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.705639 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-scripts\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.705666 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-ring-data-devices\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.705719 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-swiftconf\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.705749 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56j59\" (UniqueName: \"kubernetes.io/projected/82016afa-d3dc-4bd8-ae60-db43a0960865-kube-api-access-56j59\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.706072 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/82016afa-d3dc-4bd8-ae60-db43a0960865-etc-swift\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " 
pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.706648 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-scripts\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.707098 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-ring-data-devices\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.709454 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-dispersionconf\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.709739 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-swiftconf\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.735949 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56j59\" (UniqueName: \"kubernetes.io/projected/82016afa-d3dc-4bd8-ae60-db43a0960865-kube-api-access-56j59\") pod \"swift-ring-rebalance-ldm2v\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:51 crc 
kubenswrapper[4689]: I0307 04:36:51.806745 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:51 crc kubenswrapper[4689]: E0307 04:36:51.806981 4689 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 04:36:51 crc kubenswrapper[4689]: E0307 04:36:51.807015 4689 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 04:36:51 crc kubenswrapper[4689]: E0307 04:36:51.807088 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift podName:72bf7dd5-1e66-47a7-ae3f-477fcfb02742 nodeName:}" failed. No retries permitted until 2026-03-07 04:36:52.807065183 +0000 UTC m=+1057.853448682 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift") pod "swift-storage-0" (UID: "72bf7dd5-1e66-47a7-ae3f-477fcfb02742") : configmap "swift-ring-files" not found Mar 07 04:36:51 crc kubenswrapper[4689]: I0307 04:36:51.859501 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.291998 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-ldm2v"] Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.504228 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-52nzz"] Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.505101 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-52nzz" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.507299 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-2nh69" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.522342 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-52nzz"] Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.575133 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc"] Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.577328 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.586028 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc"] Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.617313 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwb4\" (UniqueName: \"kubernetes.io/projected/40890096-1ecf-4f34-8dff-db50ab7e3596-kube-api-access-npwb4\") pod \"glance-operator-index-52nzz\" (UID: \"40890096-1ecf-4f34-8dff-db50ab7e3596\") " pod="openstack-operators/glance-operator-index-52nzz" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.719254 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmr4v\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-kube-api-access-jmr4v\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.719560 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-log-httpd\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.719630 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npwb4\" (UniqueName: \"kubernetes.io/projected/40890096-1ecf-4f34-8dff-db50ab7e3596-kube-api-access-npwb4\") pod \"glance-operator-index-52nzz\" (UID: \"40890096-1ecf-4f34-8dff-db50ab7e3596\") " pod="openstack-operators/glance-operator-index-52nzz" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.720069 4689 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.720113 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-run-httpd\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.720143 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-config-data\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.748569 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwb4\" (UniqueName: \"kubernetes.io/projected/40890096-1ecf-4f34-8dff-db50ab7e3596-kube-api-access-npwb4\") pod \"glance-operator-index-52nzz\" (UID: \"40890096-1ecf-4f34-8dff-db50ab7e3596\") " pod="openstack-operators/glance-operator-index-52nzz" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.821846 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-log-httpd\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 
04:36:52.821911 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.821938 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-run-httpd\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.821963 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-config-data\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.822000 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmr4v\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-kube-api-access-jmr4v\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.822044 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:52 crc kubenswrapper[4689]: E0307 04:36:52.822243 4689 projected.go:288] Couldn't get 
configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 04:36:52 crc kubenswrapper[4689]: E0307 04:36:52.822260 4689 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 04:36:52 crc kubenswrapper[4689]: E0307 04:36:52.822310 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift podName:72bf7dd5-1e66-47a7-ae3f-477fcfb02742 nodeName:}" failed. No retries permitted until 2026-03-07 04:36:54.822294474 +0000 UTC m=+1059.868677973 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift") pod "swift-storage-0" (UID: "72bf7dd5-1e66-47a7-ae3f-477fcfb02742") : configmap "swift-ring-files" not found Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.823019 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-log-httpd\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: E0307 04:36:52.823092 4689 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 04:36:52 crc kubenswrapper[4689]: E0307 04:36:52.823104 4689 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc: configmap "swift-ring-files" not found Mar 07 04:36:52 crc kubenswrapper[4689]: E0307 04:36:52.823135 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift podName:6e87b1f1-2509-4c3b-9c4e-c034d697f49b nodeName:}" failed. 
No retries permitted until 2026-03-07 04:36:53.323122617 +0000 UTC m=+1058.369506126 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift") pod "swift-proxy-7c5699d58c-8dzvc" (UID: "6e87b1f1-2509-4c3b-9c4e-c034d697f49b") : configmap "swift-ring-files" not found Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.823434 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-run-httpd\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.824551 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-52nzz" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.828423 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-config-data\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:52 crc kubenswrapper[4689]: I0307 04:36:52.844586 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmr4v\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-kube-api-access-jmr4v\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:53 crc kubenswrapper[4689]: I0307 04:36:53.145787 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" 
event={"ID":"82016afa-d3dc-4bd8-ae60-db43a0960865","Type":"ContainerStarted","Data":"b46e51289c1b90813e542ffafa7d9356e36169211c287d1db15f03ab981f8433"} Mar 07 04:36:53 crc kubenswrapper[4689]: I0307 04:36:53.334163 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:53 crc kubenswrapper[4689]: E0307 04:36:53.334380 4689 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 04:36:53 crc kubenswrapper[4689]: E0307 04:36:53.334393 4689 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc: configmap "swift-ring-files" not found Mar 07 04:36:53 crc kubenswrapper[4689]: E0307 04:36:53.334435 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift podName:6e87b1f1-2509-4c3b-9c4e-c034d697f49b nodeName:}" failed. No retries permitted until 2026-03-07 04:36:54.334421599 +0000 UTC m=+1059.380805088 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift") pod "swift-proxy-7c5699d58c-8dzvc" (UID: "6e87b1f1-2509-4c3b-9c4e-c034d697f49b") : configmap "swift-ring-files" not found Mar 07 04:36:53 crc kubenswrapper[4689]: I0307 04:36:53.354937 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-52nzz"] Mar 07 04:36:54 crc kubenswrapper[4689]: I0307 04:36:54.157544 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-52nzz" event={"ID":"40890096-1ecf-4f34-8dff-db50ab7e3596","Type":"ContainerStarted","Data":"e243fa7ad9700b02c5de1312b5f6aa9d15d773b8f816aedcd08197133cc1f7e1"} Mar 07 04:36:54 crc kubenswrapper[4689]: I0307 04:36:54.351573 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:54 crc kubenswrapper[4689]: E0307 04:36:54.351802 4689 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 04:36:54 crc kubenswrapper[4689]: E0307 04:36:54.351855 4689 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc: configmap "swift-ring-files" not found Mar 07 04:36:54 crc kubenswrapper[4689]: E0307 04:36:54.351931 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift podName:6e87b1f1-2509-4c3b-9c4e-c034d697f49b nodeName:}" failed. No retries permitted until 2026-03-07 04:36:56.351906474 +0000 UTC m=+1061.398289973 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift") pod "swift-proxy-7c5699d58c-8dzvc" (UID: "6e87b1f1-2509-4c3b-9c4e-c034d697f49b") : configmap "swift-ring-files" not found Mar 07 04:36:54 crc kubenswrapper[4689]: I0307 04:36:54.858924 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:54 crc kubenswrapper[4689]: E0307 04:36:54.859223 4689 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 04:36:54 crc kubenswrapper[4689]: E0307 04:36:54.859347 4689 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 04:36:54 crc kubenswrapper[4689]: E0307 04:36:54.859401 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift podName:72bf7dd5-1e66-47a7-ae3f-477fcfb02742 nodeName:}" failed. No retries permitted until 2026-03-07 04:36:58.859381976 +0000 UTC m=+1063.905765465 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift") pod "swift-storage-0" (UID: "72bf7dd5-1e66-47a7-ae3f-477fcfb02742") : configmap "swift-ring-files" not found Mar 07 04:36:56 crc kubenswrapper[4689]: I0307 04:36:56.379878 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:36:56 crc kubenswrapper[4689]: E0307 04:36:56.380347 4689 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 04:36:56 crc kubenswrapper[4689]: E0307 04:36:56.380361 4689 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc: configmap "swift-ring-files" not found Mar 07 04:36:56 crc kubenswrapper[4689]: E0307 04:36:56.380398 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift podName:6e87b1f1-2509-4c3b-9c4e-c034d697f49b nodeName:}" failed. No retries permitted until 2026-03-07 04:37:00.380385464 +0000 UTC m=+1065.426768943 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift") pod "swift-proxy-7c5699d58c-8dzvc" (UID: "6e87b1f1-2509-4c3b-9c4e-c034d697f49b") : configmap "swift-ring-files" not found Mar 07 04:36:56 crc kubenswrapper[4689]: I0307 04:36:56.895493 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-52nzz"] Mar 07 04:36:57 crc kubenswrapper[4689]: I0307 04:36:57.504238 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-vr96j"] Mar 07 04:36:57 crc kubenswrapper[4689]: I0307 04:36:57.505216 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-vr96j" Mar 07 04:36:57 crc kubenswrapper[4689]: I0307 04:36:57.514405 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-vr96j"] Mar 07 04:36:57 crc kubenswrapper[4689]: I0307 04:36:57.701120 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xgkw\" (UniqueName: \"kubernetes.io/projected/bd8b0d1d-32da-409d-9453-bef0c8ca65f1-kube-api-access-6xgkw\") pod \"glance-operator-index-vr96j\" (UID: \"bd8b0d1d-32da-409d-9453-bef0c8ca65f1\") " pod="openstack-operators/glance-operator-index-vr96j" Mar 07 04:36:57 crc kubenswrapper[4689]: I0307 04:36:57.804120 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xgkw\" (UniqueName: \"kubernetes.io/projected/bd8b0d1d-32da-409d-9453-bef0c8ca65f1-kube-api-access-6xgkw\") pod \"glance-operator-index-vr96j\" (UID: \"bd8b0d1d-32da-409d-9453-bef0c8ca65f1\") " pod="openstack-operators/glance-operator-index-vr96j" Mar 07 04:36:57 crc kubenswrapper[4689]: I0307 04:36:57.837672 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xgkw\" (UniqueName: 
\"kubernetes.io/projected/bd8b0d1d-32da-409d-9453-bef0c8ca65f1-kube-api-access-6xgkw\") pod \"glance-operator-index-vr96j\" (UID: \"bd8b0d1d-32da-409d-9453-bef0c8ca65f1\") " pod="openstack-operators/glance-operator-index-vr96j" Mar 07 04:36:58 crc kubenswrapper[4689]: I0307 04:36:58.132753 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-vr96j" Mar 07 04:36:58 crc kubenswrapper[4689]: I0307 04:36:58.877154 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-vr96j"] Mar 07 04:36:58 crc kubenswrapper[4689]: I0307 04:36:58.921835 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:36:58 crc kubenswrapper[4689]: E0307 04:36:58.922103 4689 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 04:36:58 crc kubenswrapper[4689]: E0307 04:36:58.922144 4689 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 04:36:58 crc kubenswrapper[4689]: E0307 04:36:58.922260 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift podName:72bf7dd5-1e66-47a7-ae3f-477fcfb02742 nodeName:}" failed. No retries permitted until 2026-03-07 04:37:06.922231493 +0000 UTC m=+1071.968615012 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift") pod "swift-storage-0" (UID: "72bf7dd5-1e66-47a7-ae3f-477fcfb02742") : configmap "swift-ring-files" not found Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.194148 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-vr96j" event={"ID":"bd8b0d1d-32da-409d-9453-bef0c8ca65f1","Type":"ContainerStarted","Data":"69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5"} Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.194597 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-vr96j" event={"ID":"bd8b0d1d-32da-409d-9453-bef0c8ca65f1","Type":"ContainerStarted","Data":"1b5523618aff16ba392613154f43436eb9653012922a39cfa69da84f1170d8c9"} Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.198044 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-52nzz" event={"ID":"40890096-1ecf-4f34-8dff-db50ab7e3596","Type":"ContainerStarted","Data":"01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9"} Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.198228 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-index-52nzz" podUID="40890096-1ecf-4f34-8dff-db50ab7e3596" containerName="registry-server" containerID="cri-o://01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9" gracePeriod=2 Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.200123 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" event={"ID":"82016afa-d3dc-4bd8-ae60-db43a0960865","Type":"ContainerStarted","Data":"ad307d5a4ec8c0a23c00d2d81ea49346a33388014013494e5a8093558af45270"} Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.213500 4689 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-vr96j" podStartSLOduration=2.160282392 podStartE2EDuration="2.213479364s" podCreationTimestamp="2026-03-07 04:36:57 +0000 UTC" firstStartedPulling="2026-03-07 04:36:58.880243133 +0000 UTC m=+1063.926626632" lastFinishedPulling="2026-03-07 04:36:58.933440105 +0000 UTC m=+1063.979823604" observedRunningTime="2026-03-07 04:36:59.211133531 +0000 UTC m=+1064.257517020" watchObservedRunningTime="2026-03-07 04:36:59.213479364 +0000 UTC m=+1064.259862863" Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.228153 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-52nzz" podStartSLOduration=2.142599857 podStartE2EDuration="7.228136999s" podCreationTimestamp="2026-03-07 04:36:52 +0000 UTC" firstStartedPulling="2026-03-07 04:36:53.363747909 +0000 UTC m=+1058.410131408" lastFinishedPulling="2026-03-07 04:36:58.449285061 +0000 UTC m=+1063.495668550" observedRunningTime="2026-03-07 04:36:59.22666383 +0000 UTC m=+1064.273047319" watchObservedRunningTime="2026-03-07 04:36:59.228136999 +0000 UTC m=+1064.274520488" Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.251814 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" podStartSLOduration=4.653805604 podStartE2EDuration="8.251790386s" podCreationTimestamp="2026-03-07 04:36:51 +0000 UTC" firstStartedPulling="2026-03-07 04:36:52.287554479 +0000 UTC m=+1057.333937998" lastFinishedPulling="2026-03-07 04:36:55.885539291 +0000 UTC m=+1060.931922780" observedRunningTime="2026-03-07 04:36:59.245522147 +0000 UTC m=+1064.291905636" watchObservedRunningTime="2026-03-07 04:36:59.251790386 +0000 UTC m=+1064.298173885" Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.734160 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-52nzz" Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.837333 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npwb4\" (UniqueName: \"kubernetes.io/projected/40890096-1ecf-4f34-8dff-db50ab7e3596-kube-api-access-npwb4\") pod \"40890096-1ecf-4f34-8dff-db50ab7e3596\" (UID: \"40890096-1ecf-4f34-8dff-db50ab7e3596\") " Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.843685 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40890096-1ecf-4f34-8dff-db50ab7e3596-kube-api-access-npwb4" (OuterVolumeSpecName: "kube-api-access-npwb4") pod "40890096-1ecf-4f34-8dff-db50ab7e3596" (UID: "40890096-1ecf-4f34-8dff-db50ab7e3596"). InnerVolumeSpecName "kube-api-access-npwb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:36:59 crc kubenswrapper[4689]: I0307 04:36:59.939649 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npwb4\" (UniqueName: \"kubernetes.io/projected/40890096-1ecf-4f34-8dff-db50ab7e3596-kube-api-access-npwb4\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:00 crc kubenswrapper[4689]: I0307 04:37:00.210140 4689 generic.go:334] "Generic (PLEG): container finished" podID="40890096-1ecf-4f34-8dff-db50ab7e3596" containerID="01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9" exitCode=0 Mar 07 04:37:00 crc kubenswrapper[4689]: I0307 04:37:00.210244 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-52nzz" Mar 07 04:37:00 crc kubenswrapper[4689]: I0307 04:37:00.210235 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-52nzz" event={"ID":"40890096-1ecf-4f34-8dff-db50ab7e3596","Type":"ContainerDied","Data":"01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9"} Mar 07 04:37:00 crc kubenswrapper[4689]: I0307 04:37:00.210467 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-52nzz" event={"ID":"40890096-1ecf-4f34-8dff-db50ab7e3596","Type":"ContainerDied","Data":"e243fa7ad9700b02c5de1312b5f6aa9d15d773b8f816aedcd08197133cc1f7e1"} Mar 07 04:37:00 crc kubenswrapper[4689]: I0307 04:37:00.210507 4689 scope.go:117] "RemoveContainer" containerID="01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9" Mar 07 04:37:00 crc kubenswrapper[4689]: I0307 04:37:00.227479 4689 scope.go:117] "RemoveContainer" containerID="01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9" Mar 07 04:37:00 crc kubenswrapper[4689]: E0307 04:37:00.227885 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9\": container with ID starting with 01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9 not found: ID does not exist" containerID="01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9" Mar 07 04:37:00 crc kubenswrapper[4689]: I0307 04:37:00.227923 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9"} err="failed to get container status \"01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9\": rpc error: code = NotFound desc = could not find container 
\"01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9\": container with ID starting with 01a962997eb8fe69c282e46e8c840bf2e4377c13996de5e99f982cfebdcedaf9 not found: ID does not exist" Mar 07 04:37:00 crc kubenswrapper[4689]: I0307 04:37:00.263421 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-52nzz"] Mar 07 04:37:00 crc kubenswrapper[4689]: I0307 04:37:00.269547 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-index-52nzz"] Mar 07 04:37:00 crc kubenswrapper[4689]: E0307 04:37:00.373272 4689 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40890096_1ecf_4f34_8dff_db50ab7e3596.slice/crio-e243fa7ad9700b02c5de1312b5f6aa9d15d773b8f816aedcd08197133cc1f7e1\": RecentStats: unable to find data in memory cache]" Mar 07 04:37:00 crc kubenswrapper[4689]: I0307 04:37:00.455914 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:37:00 crc kubenswrapper[4689]: E0307 04:37:00.456245 4689 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 04:37:00 crc kubenswrapper[4689]: E0307 04:37:00.456271 4689 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc: configmap "swift-ring-files" not found Mar 07 04:37:00 crc kubenswrapper[4689]: E0307 04:37:00.456349 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift podName:6e87b1f1-2509-4c3b-9c4e-c034d697f49b 
nodeName:}" failed. No retries permitted until 2026-03-07 04:37:08.456325324 +0000 UTC m=+1073.502708863 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift") pod "swift-proxy-7c5699d58c-8dzvc" (UID: "6e87b1f1-2509-4c3b-9c4e-c034d697f49b") : configmap "swift-ring-files" not found Mar 07 04:37:01 crc kubenswrapper[4689]: I0307 04:37:01.841445 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40890096-1ecf-4f34-8dff-db50ab7e3596" path="/var/lib/kubelet/pods/40890096-1ecf-4f34-8dff-db50ab7e3596/volumes" Mar 07 04:37:05 crc kubenswrapper[4689]: I0307 04:37:05.249555 4689 generic.go:334] "Generic (PLEG): container finished" podID="82016afa-d3dc-4bd8-ae60-db43a0960865" containerID="ad307d5a4ec8c0a23c00d2d81ea49346a33388014013494e5a8093558af45270" exitCode=0 Mar 07 04:37:05 crc kubenswrapper[4689]: I0307 04:37:05.249656 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" event={"ID":"82016afa-d3dc-4bd8-ae60-db43a0960865","Type":"ContainerDied","Data":"ad307d5a4ec8c0a23c00d2d81ea49346a33388014013494e5a8093558af45270"} Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.530313 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.573128 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-ring-data-devices\") pod \"82016afa-d3dc-4bd8-ae60-db43a0960865\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.573224 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-swiftconf\") pod \"82016afa-d3dc-4bd8-ae60-db43a0960865\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.573334 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-dispersionconf\") pod \"82016afa-d3dc-4bd8-ae60-db43a0960865\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.573374 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-scripts\") pod \"82016afa-d3dc-4bd8-ae60-db43a0960865\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.573418 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/82016afa-d3dc-4bd8-ae60-db43a0960865-etc-swift\") pod \"82016afa-d3dc-4bd8-ae60-db43a0960865\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.573450 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56j59\" (UniqueName: 
\"kubernetes.io/projected/82016afa-d3dc-4bd8-ae60-db43a0960865-kube-api-access-56j59\") pod \"82016afa-d3dc-4bd8-ae60-db43a0960865\" (UID: \"82016afa-d3dc-4bd8-ae60-db43a0960865\") " Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.575946 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "82016afa-d3dc-4bd8-ae60-db43a0960865" (UID: "82016afa-d3dc-4bd8-ae60-db43a0960865"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.576254 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82016afa-d3dc-4bd8-ae60-db43a0960865-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "82016afa-d3dc-4bd8-ae60-db43a0960865" (UID: "82016afa-d3dc-4bd8-ae60-db43a0960865"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.580134 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82016afa-d3dc-4bd8-ae60-db43a0960865-kube-api-access-56j59" (OuterVolumeSpecName: "kube-api-access-56j59") pod "82016afa-d3dc-4bd8-ae60-db43a0960865" (UID: "82016afa-d3dc-4bd8-ae60-db43a0960865"). InnerVolumeSpecName "kube-api-access-56j59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.582420 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "82016afa-d3dc-4bd8-ae60-db43a0960865" (UID: "82016afa-d3dc-4bd8-ae60-db43a0960865"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.592043 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-scripts" (OuterVolumeSpecName: "scripts") pod "82016afa-d3dc-4bd8-ae60-db43a0960865" (UID: "82016afa-d3dc-4bd8-ae60-db43a0960865"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.603583 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "82016afa-d3dc-4bd8-ae60-db43a0960865" (UID: "82016afa-d3dc-4bd8-ae60-db43a0960865"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.675007 4689 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.675253 4689 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.675340 4689 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/82016afa-d3dc-4bd8-ae60-db43a0960865-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.675401 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82016afa-d3dc-4bd8-ae60-db43a0960865-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 
04:37:06.675458 4689 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/82016afa-d3dc-4bd8-ae60-db43a0960865-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.675512 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56j59\" (UniqueName: \"kubernetes.io/projected/82016afa-d3dc-4bd8-ae60-db43a0960865-kube-api-access-56j59\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.979251 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:37:06 crc kubenswrapper[4689]: I0307 04:37:06.989754 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift\") pod \"swift-storage-0\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:37:07 crc kubenswrapper[4689]: I0307 04:37:07.039602 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:37:07 crc kubenswrapper[4689]: I0307 04:37:07.289682 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" event={"ID":"82016afa-d3dc-4bd8-ae60-db43a0960865","Type":"ContainerDied","Data":"b46e51289c1b90813e542ffafa7d9356e36169211c287d1db15f03ab981f8433"} Mar 07 04:37:07 crc kubenswrapper[4689]: I0307 04:37:07.289991 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b46e51289c1b90813e542ffafa7d9356e36169211c287d1db15f03ab981f8433" Mar 07 04:37:07 crc kubenswrapper[4689]: I0307 04:37:07.289888 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-ldm2v" Mar 07 04:37:07 crc kubenswrapper[4689]: I0307 04:37:07.593716 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Mar 07 04:37:07 crc kubenswrapper[4689]: W0307 04:37:07.602418 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72bf7dd5_1e66_47a7_ae3f_477fcfb02742.slice/crio-8a0e80d31c1c06f9c555faba84f7fad9db38e5c920bc01a3b1c6d095ca12ab39 WatchSource:0}: Error finding container 8a0e80d31c1c06f9c555faba84f7fad9db38e5c920bc01a3b1c6d095ca12ab39: Status 404 returned error can't find the container with id 8a0e80d31c1c06f9c555faba84f7fad9db38e5c920bc01a3b1c6d095ca12ab39 Mar 07 04:37:08 crc kubenswrapper[4689]: I0307 04:37:08.133884 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-vr96j" Mar 07 04:37:08 crc kubenswrapper[4689]: I0307 04:37:08.134259 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-vr96j" Mar 07 04:37:08 crc kubenswrapper[4689]: I0307 04:37:08.181460 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack-operators/glance-operator-index-vr96j" Mar 07 04:37:08 crc kubenswrapper[4689]: I0307 04:37:08.301001 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"8a0e80d31c1c06f9c555faba84f7fad9db38e5c920bc01a3b1c6d095ca12ab39"} Mar 07 04:37:08 crc kubenswrapper[4689]: I0307 04:37:08.369086 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-vr96j" Mar 07 04:37:08 crc kubenswrapper[4689]: I0307 04:37:08.535540 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:37:08 crc kubenswrapper[4689]: I0307 04:37:08.543409 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift\") pod \"swift-proxy-7c5699d58c-8dzvc\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") " pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:37:08 crc kubenswrapper[4689]: I0307 04:37:08.803274 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:37:09 crc kubenswrapper[4689]: I0307 04:37:09.249093 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc"] Mar 07 04:37:09 crc kubenswrapper[4689]: W0307 04:37:09.257672 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e87b1f1_2509_4c3b_9c4e_c034d697f49b.slice/crio-d2d58c6aff3ee5415fcd36b566734413bd182221febb561826600f2edd68e397 WatchSource:0}: Error finding container d2d58c6aff3ee5415fcd36b566734413bd182221febb561826600f2edd68e397: Status 404 returned error can't find the container with id d2d58c6aff3ee5415fcd36b566734413bd182221febb561826600f2edd68e397 Mar 07 04:37:09 crc kubenswrapper[4689]: I0307 04:37:09.315539 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" event={"ID":"6e87b1f1-2509-4c3b-9c4e-c034d697f49b","Type":"ContainerStarted","Data":"d2d58c6aff3ee5415fcd36b566734413bd182221febb561826600f2edd68e397"} Mar 07 04:37:09 crc kubenswrapper[4689]: I0307 04:37:09.319387 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"4cf74cb6827c9d9ba68e8c8dfa337659418d093ae76fce1056d4f84b43758ab5"} Mar 07 04:37:09 crc kubenswrapper[4689]: I0307 04:37:09.319417 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"413ff247560a52a36969f0cf2f05c5b652d77df05c0d4413b58fcf079e14f38c"} Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.335785 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"70864ec57f40a96c0cbd682f99e5cc28caf7680eef12aaeebbb5fef77b84ca71"} Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.336124 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"95c6d4b787a84767360101ff6c8db1dcbf368d75db58bcc4657444d42e1121e2"} Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.337090 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr"] Mar 07 04:37:10 crc kubenswrapper[4689]: E0307 04:37:10.337345 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40890096-1ecf-4f34-8dff-db50ab7e3596" containerName="registry-server" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.337357 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="40890096-1ecf-4f34-8dff-db50ab7e3596" containerName="registry-server" Mar 07 04:37:10 crc kubenswrapper[4689]: E0307 04:37:10.337386 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82016afa-d3dc-4bd8-ae60-db43a0960865" containerName="swift-ring-rebalance" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.337394 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="82016afa-d3dc-4bd8-ae60-db43a0960865" containerName="swift-ring-rebalance" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.337529 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="82016afa-d3dc-4bd8-ae60-db43a0960865" containerName="swift-ring-rebalance" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.337543 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="40890096-1ecf-4f34-8dff-db50ab7e3596" containerName="registry-server" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.338410 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.341375 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" event={"ID":"6e87b1f1-2509-4c3b-9c4e-c034d697f49b","Type":"ContainerStarted","Data":"45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6"} Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.341432 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" event={"ID":"6e87b1f1-2509-4c3b-9c4e-c034d697f49b","Type":"ContainerStarted","Data":"26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614"} Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.341601 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.341625 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.351142 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4j8gt" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.354288 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr"] Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.386551 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" podStartSLOduration=18.386533366 podStartE2EDuration="18.386533366s" podCreationTimestamp="2026-03-07 04:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:37:10.384107331 +0000 UTC 
m=+1075.430490820" watchObservedRunningTime="2026-03-07 04:37:10.386533366 +0000 UTC m=+1075.432916855" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.463412 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-bundle\") pod \"ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.463479 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-util\") pod \"ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.463659 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qx6\" (UniqueName: \"kubernetes.io/projected/f02cb0ce-c569-4668-bc73-142e3340935f-kube-api-access-m5qx6\") pod \"ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.565627 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-util\") pod \"ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 
07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.565989 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-bundle\") pod \"ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.566155 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qx6\" (UniqueName: \"kubernetes.io/projected/f02cb0ce-c569-4668-bc73-142e3340935f-kube-api-access-m5qx6\") pod \"ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.566652 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-util\") pod \"ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.566889 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-bundle\") pod \"ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.589627 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qx6\" (UniqueName: 
\"kubernetes.io/projected/f02cb0ce-c569-4668-bc73-142e3340935f-kube-api-access-m5qx6\") pod \"ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:10 crc kubenswrapper[4689]: I0307 04:37:10.655834 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:11 crc kubenswrapper[4689]: I0307 04:37:11.258908 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr"] Mar 07 04:37:11 crc kubenswrapper[4689]: W0307 04:37:11.264226 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf02cb0ce_c569_4668_bc73_142e3340935f.slice/crio-65b24fcfeaf8a4683a63fece5313c032fb64b9feddc4fd728d625dfbaae29540 WatchSource:0}: Error finding container 65b24fcfeaf8a4683a63fece5313c032fb64b9feddc4fd728d625dfbaae29540: Status 404 returned error can't find the container with id 65b24fcfeaf8a4683a63fece5313c032fb64b9feddc4fd728d625dfbaae29540 Mar 07 04:37:11 crc kubenswrapper[4689]: I0307 04:37:11.350384 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"1b3960e36d0b90b01c78ef9cdfc8857c059e03d4fc0b35cebbdbda9d25c2e743"} Mar 07 04:37:11 crc kubenswrapper[4689]: I0307 04:37:11.350423 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"1ce86596f91d66453e465f97afa8624aca1b2b8b2d59d3a5f990349cc84881ae"} Mar 07 04:37:11 crc kubenswrapper[4689]: I0307 04:37:11.351418 4689 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" event={"ID":"f02cb0ce-c569-4668-bc73-142e3340935f","Type":"ContainerStarted","Data":"65b24fcfeaf8a4683a63fece5313c032fb64b9feddc4fd728d625dfbaae29540"} Mar 07 04:37:12 crc kubenswrapper[4689]: I0307 04:37:12.362060 4689 generic.go:334] "Generic (PLEG): container finished" podID="f02cb0ce-c569-4668-bc73-142e3340935f" containerID="a93bf6538071e1db6db606ad71151582e0647e5a78cac50834314780bfbd763c" exitCode=0 Mar 07 04:37:12 crc kubenswrapper[4689]: I0307 04:37:12.362118 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" event={"ID":"f02cb0ce-c569-4668-bc73-142e3340935f","Type":"ContainerDied","Data":"a93bf6538071e1db6db606ad71151582e0647e5a78cac50834314780bfbd763c"} Mar 07 04:37:12 crc kubenswrapper[4689]: I0307 04:37:12.373613 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"df2b64bed9e2330912063f36cf4cceb10965d467dee93369db2730c1e257474e"} Mar 07 04:37:12 crc kubenswrapper[4689]: I0307 04:37:12.373681 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"faad358fd307a99964689f91a5acb7e967ffb6178743b7f718e092bf976a7e8d"} Mar 07 04:37:13 crc kubenswrapper[4689]: I0307 04:37:13.385902 4689 generic.go:334] "Generic (PLEG): container finished" podID="f02cb0ce-c569-4668-bc73-142e3340935f" containerID="88b647131cc1a5f8296c7683f24cdf754aa521e804e9978b5aa0d333118b0b93" exitCode=0 Mar 07 04:37:13 crc kubenswrapper[4689]: I0307 04:37:13.386017 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" 
event={"ID":"f02cb0ce-c569-4668-bc73-142e3340935f","Type":"ContainerDied","Data":"88b647131cc1a5f8296c7683f24cdf754aa521e804e9978b5aa0d333118b0b93"} Mar 07 04:37:13 crc kubenswrapper[4689]: I0307 04:37:13.395222 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"d0fa234be29bc574f8e8ac0e9059a71a0665d50a4a5e8587b656627fe358168d"} Mar 07 04:37:13 crc kubenswrapper[4689]: I0307 04:37:13.395267 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"fcc4fd1908f707c3a9b6e85c0ed1a296725aeb2ace62d4136b7cfdf7e4793cb9"} Mar 07 04:37:13 crc kubenswrapper[4689]: I0307 04:37:13.395282 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"838f0a0e47edc581a3586403409c00fe391e1a0446670e6c3dae72a34453a3a6"} Mar 07 04:37:13 crc kubenswrapper[4689]: I0307 04:37:13.395293 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"c9115983fb96eb604ca6eee60e5a2764c938cbe715a4b97fa1cffc9f4cfcf61f"} Mar 07 04:37:14 crc kubenswrapper[4689]: I0307 04:37:14.407324 4689 generic.go:334] "Generic (PLEG): container finished" podID="f02cb0ce-c569-4668-bc73-142e3340935f" containerID="acab0757368f55b51224f719dbdc35614a84224ef9dbaec25cc43618dc55b963" exitCode=0 Mar 07 04:37:14 crc kubenswrapper[4689]: I0307 04:37:14.407369 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" 
event={"ID":"f02cb0ce-c569-4668-bc73-142e3340935f","Type":"ContainerDied","Data":"acab0757368f55b51224f719dbdc35614a84224ef9dbaec25cc43618dc55b963"} Mar 07 04:37:14 crc kubenswrapper[4689]: I0307 04:37:14.415766 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"af3a3e3771dc5dcb25112f7477a92fb7553646ad838f63f1cf844231472aa223"} Mar 07 04:37:14 crc kubenswrapper[4689]: I0307 04:37:14.415816 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"a460a955db79fbf911a926367b117b7f6ceb0c5df6dbccaeddba0833bd8d1785"} Mar 07 04:37:14 crc kubenswrapper[4689]: I0307 04:37:14.415829 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerStarted","Data":"1f89449ad1a80fea4286fef990935e960507f7bf84f0d843d58e0d743c6402d3"} Mar 07 04:37:14 crc kubenswrapper[4689]: I0307 04:37:14.465050 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=19.48182896 podStartE2EDuration="24.465026964s" podCreationTimestamp="2026-03-07 04:36:50 +0000 UTC" firstStartedPulling="2026-03-07 04:37:07.604828989 +0000 UTC m=+1072.651212478" lastFinishedPulling="2026-03-07 04:37:12.588026953 +0000 UTC m=+1077.634410482" observedRunningTime="2026-03-07 04:37:14.461058767 +0000 UTC m=+1079.507442296" watchObservedRunningTime="2026-03-07 04:37:14.465026964 +0000 UTC m=+1079.511410473" Mar 07 04:37:15 crc kubenswrapper[4689]: I0307 04:37:15.883050 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.046685 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-bundle\") pod \"f02cb0ce-c569-4668-bc73-142e3340935f\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.046779 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5qx6\" (UniqueName: \"kubernetes.io/projected/f02cb0ce-c569-4668-bc73-142e3340935f-kube-api-access-m5qx6\") pod \"f02cb0ce-c569-4668-bc73-142e3340935f\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.046869 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-util\") pod \"f02cb0ce-c569-4668-bc73-142e3340935f\" (UID: \"f02cb0ce-c569-4668-bc73-142e3340935f\") " Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.047956 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-bundle" (OuterVolumeSpecName: "bundle") pod "f02cb0ce-c569-4668-bc73-142e3340935f" (UID: "f02cb0ce-c569-4668-bc73-142e3340935f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.055736 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f02cb0ce-c569-4668-bc73-142e3340935f-kube-api-access-m5qx6" (OuterVolumeSpecName: "kube-api-access-m5qx6") pod "f02cb0ce-c569-4668-bc73-142e3340935f" (UID: "f02cb0ce-c569-4668-bc73-142e3340935f"). InnerVolumeSpecName "kube-api-access-m5qx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.061063 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-util" (OuterVolumeSpecName: "util") pod "f02cb0ce-c569-4668-bc73-142e3340935f" (UID: "f02cb0ce-c569-4668-bc73-142e3340935f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.149344 4689 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-util\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.149379 4689 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f02cb0ce-c569-4668-bc73-142e3340935f-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.149393 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5qx6\" (UniqueName: \"kubernetes.io/projected/f02cb0ce-c569-4668-bc73-142e3340935f-kube-api-access-m5qx6\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.433182 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" event={"ID":"f02cb0ce-c569-4668-bc73-142e3340935f","Type":"ContainerDied","Data":"65b24fcfeaf8a4683a63fece5313c032fb64b9feddc4fd728d625dfbaae29540"} Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.433230 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65b24fcfeaf8a4683a63fece5313c032fb64b9feddc4fd728d625dfbaae29540" Mar 07 04:37:16 crc kubenswrapper[4689]: I0307 04:37:16.433254 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr" Mar 07 04:37:18 crc kubenswrapper[4689]: I0307 04:37:18.806775 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:37:18 crc kubenswrapper[4689]: I0307 04:37:18.812098 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.211590 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn"] Mar 07 04:37:27 crc kubenswrapper[4689]: E0307 04:37:27.212386 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02cb0ce-c569-4668-bc73-142e3340935f" containerName="extract" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.212404 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02cb0ce-c569-4668-bc73-142e3340935f" containerName="extract" Mar 07 04:37:27 crc kubenswrapper[4689]: E0307 04:37:27.212419 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02cb0ce-c569-4668-bc73-142e3340935f" containerName="util" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.212427 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02cb0ce-c569-4668-bc73-142e3340935f" containerName="util" Mar 07 04:37:27 crc kubenswrapper[4689]: E0307 04:37:27.212447 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02cb0ce-c569-4668-bc73-142e3340935f" containerName="pull" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.212454 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02cb0ce-c569-4668-bc73-142e3340935f" containerName="pull" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.212593 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f02cb0ce-c569-4668-bc73-142e3340935f" 
containerName="extract" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.213092 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.214892 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.216078 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-n4v8p" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.224470 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn"] Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.313689 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwt8t\" (UniqueName: \"kubernetes.io/projected/c5573c74-db15-40d3-9e5a-fa66061ec3bb-kube-api-access-wwt8t\") pod \"glance-operator-controller-manager-5bd67dfbcc-6grpn\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") " pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.313765 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-apiservice-cert\") pod \"glance-operator-controller-manager-5bd67dfbcc-6grpn\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") " pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.313803 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-webhook-cert\") pod \"glance-operator-controller-manager-5bd67dfbcc-6grpn\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") " pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.416030 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwt8t\" (UniqueName: \"kubernetes.io/projected/c5573c74-db15-40d3-9e5a-fa66061ec3bb-kube-api-access-wwt8t\") pod \"glance-operator-controller-manager-5bd67dfbcc-6grpn\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") " pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.416099 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-apiservice-cert\") pod \"glance-operator-controller-manager-5bd67dfbcc-6grpn\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") " pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.416134 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-webhook-cert\") pod \"glance-operator-controller-manager-5bd67dfbcc-6grpn\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") " pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.422305 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-apiservice-cert\") pod \"glance-operator-controller-manager-5bd67dfbcc-6grpn\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") " 
pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.422971 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-webhook-cert\") pod \"glance-operator-controller-manager-5bd67dfbcc-6grpn\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") " pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.433336 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwt8t\" (UniqueName: \"kubernetes.io/projected/c5573c74-db15-40d3-9e5a-fa66061ec3bb-kube-api-access-wwt8t\") pod \"glance-operator-controller-manager-5bd67dfbcc-6grpn\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") " pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:27 crc kubenswrapper[4689]: I0307 04:37:27.530029 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:28 crc kubenswrapper[4689]: I0307 04:37:28.077484 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn"] Mar 07 04:37:28 crc kubenswrapper[4689]: I0307 04:37:28.574015 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" event={"ID":"c5573c74-db15-40d3-9e5a-fa66061ec3bb","Type":"ContainerStarted","Data":"4abee1e9515cf4aa783dd0bd5e8bb2632b616a4cb74614e22db148428e48f3aa"} Mar 07 04:37:30 crc kubenswrapper[4689]: I0307 04:37:30.591064 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" event={"ID":"c5573c74-db15-40d3-9e5a-fa66061ec3bb","Type":"ContainerStarted","Data":"0e3de09508b036f7aff417affb92d290088f4676c282117a5adf1d8b786704a1"} Mar 07 04:37:30 crc kubenswrapper[4689]: I0307 04:37:30.591700 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:30 crc kubenswrapper[4689]: I0307 04:37:30.610782 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" podStartSLOduration=2.07081492 podStartE2EDuration="3.610764578s" podCreationTimestamp="2026-03-07 04:37:27 +0000 UTC" firstStartedPulling="2026-03-07 04:37:28.094122377 +0000 UTC m=+1093.140505866" lastFinishedPulling="2026-03-07 04:37:29.634072045 +0000 UTC m=+1094.680455524" observedRunningTime="2026-03-07 04:37:30.605121736 +0000 UTC m=+1095.651505225" watchObservedRunningTime="2026-03-07 04:37:30.610764578 +0000 UTC m=+1095.657148067" Mar 07 04:37:37 crc kubenswrapper[4689]: I0307 04:37:37.536119 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.086657 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.087859 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.089582 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.090054 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.090345 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-6k9k4c8bfg" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.091903 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-q4kfd" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.094785 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.109177 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg7nc\" (UniqueName: \"kubernetes.io/projected/4f8b0c10-1830-4a35-b5d7-a5f00a990965-kube-api-access-sg7nc\") pod \"openstackclient\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.109422 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-scripts\") pod 
\"openstackclient\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.109554 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config-secret\") pod \"openstackclient\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.109622 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config\") pod \"openstackclient\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.210703 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-scripts\") pod \"openstackclient\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.210767 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config-secret\") pod \"openstackclient\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.210796 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config\") pod \"openstackclient\" (UID: 
\"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.210817 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg7nc\" (UniqueName: \"kubernetes.io/projected/4f8b0c10-1830-4a35-b5d7-a5f00a990965-kube-api-access-sg7nc\") pod \"openstackclient\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.212086 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config\") pod \"openstackclient\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.212130 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-scripts\") pod \"openstackclient\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.226981 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config-secret\") pod \"openstackclient\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.231835 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg7nc\" (UniqueName: \"kubernetes.io/projected/4f8b0c10-1830-4a35-b5d7-a5f00a990965-kube-api-access-sg7nc\") pod \"openstackclient\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc 
kubenswrapper[4689]: I0307 04:37:40.404258 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Mar 07 04:37:40 crc kubenswrapper[4689]: I0307 04:37:40.865739 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Mar 07 04:37:41 crc kubenswrapper[4689]: I0307 04:37:41.694051 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"4f8b0c10-1830-4a35-b5d7-a5f00a990965","Type":"ContainerStarted","Data":"1598d143917e920cce41aba148e697a3691ad0d35823fc203d7e8ea6d38e8215"} Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.831327 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-lrppg"] Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.832782 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-lrppg" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.837501 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-1932-account-create-update-vqggs"] Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.838237 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.839818 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.843695 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-lrppg"] Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.873790 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-1932-account-create-update-vqggs"] Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.891689 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf61f191-6961-4d76-bf10-2a6fad17cab5-operator-scripts\") pod \"glance-1932-account-create-update-vqggs\" (UID: \"bf61f191-6961-4d76-bf10-2a6fad17cab5\") " pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.891745 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfrjn\" (UniqueName: \"kubernetes.io/projected/377abb24-c403-4bc4-96c6-904786cddd96-kube-api-access-gfrjn\") pod \"glance-db-create-lrppg\" (UID: \"377abb24-c403-4bc4-96c6-904786cddd96\") " pod="glance-kuttl-tests/glance-db-create-lrppg" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.891764 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/377abb24-c403-4bc4-96c6-904786cddd96-operator-scripts\") pod \"glance-db-create-lrppg\" (UID: \"377abb24-c403-4bc4-96c6-904786cddd96\") " pod="glance-kuttl-tests/glance-db-create-lrppg" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.891785 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk9cd\" (UniqueName: \"kubernetes.io/projected/bf61f191-6961-4d76-bf10-2a6fad17cab5-kube-api-access-sk9cd\") pod \"glance-1932-account-create-update-vqggs\" (UID: \"bf61f191-6961-4d76-bf10-2a6fad17cab5\") " pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.993433 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf61f191-6961-4d76-bf10-2a6fad17cab5-operator-scripts\") pod \"glance-1932-account-create-update-vqggs\" (UID: \"bf61f191-6961-4d76-bf10-2a6fad17cab5\") " pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.993522 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfrjn\" (UniqueName: \"kubernetes.io/projected/377abb24-c403-4bc4-96c6-904786cddd96-kube-api-access-gfrjn\") pod \"glance-db-create-lrppg\" (UID: \"377abb24-c403-4bc4-96c6-904786cddd96\") " pod="glance-kuttl-tests/glance-db-create-lrppg" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.993547 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/377abb24-c403-4bc4-96c6-904786cddd96-operator-scripts\") pod \"glance-db-create-lrppg\" (UID: \"377abb24-c403-4bc4-96c6-904786cddd96\") " pod="glance-kuttl-tests/glance-db-create-lrppg" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.993574 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk9cd\" (UniqueName: \"kubernetes.io/projected/bf61f191-6961-4d76-bf10-2a6fad17cab5-kube-api-access-sk9cd\") pod \"glance-1932-account-create-update-vqggs\" (UID: \"bf61f191-6961-4d76-bf10-2a6fad17cab5\") " 
pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.994584 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/377abb24-c403-4bc4-96c6-904786cddd96-operator-scripts\") pod \"glance-db-create-lrppg\" (UID: \"377abb24-c403-4bc4-96c6-904786cddd96\") " pod="glance-kuttl-tests/glance-db-create-lrppg" Mar 07 04:37:44 crc kubenswrapper[4689]: I0307 04:37:44.994766 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf61f191-6961-4d76-bf10-2a6fad17cab5-operator-scripts\") pod \"glance-1932-account-create-update-vqggs\" (UID: \"bf61f191-6961-4d76-bf10-2a6fad17cab5\") " pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" Mar 07 04:37:45 crc kubenswrapper[4689]: I0307 04:37:45.014493 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfrjn\" (UniqueName: \"kubernetes.io/projected/377abb24-c403-4bc4-96c6-904786cddd96-kube-api-access-gfrjn\") pod \"glance-db-create-lrppg\" (UID: \"377abb24-c403-4bc4-96c6-904786cddd96\") " pod="glance-kuttl-tests/glance-db-create-lrppg" Mar 07 04:37:45 crc kubenswrapper[4689]: I0307 04:37:45.016375 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk9cd\" (UniqueName: \"kubernetes.io/projected/bf61f191-6961-4d76-bf10-2a6fad17cab5-kube-api-access-sk9cd\") pod \"glance-1932-account-create-update-vqggs\" (UID: \"bf61f191-6961-4d76-bf10-2a6fad17cab5\") " pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" Mar 07 04:37:45 crc kubenswrapper[4689]: I0307 04:37:45.164772 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-lrppg" Mar 07 04:37:45 crc kubenswrapper[4689]: I0307 04:37:45.180727 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" Mar 07 04:37:48 crc kubenswrapper[4689]: I0307 04:37:48.745583 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"4f8b0c10-1830-4a35-b5d7-a5f00a990965","Type":"ContainerStarted","Data":"b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82"} Mar 07 04:37:48 crc kubenswrapper[4689]: I0307 04:37:48.958006 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-1932-account-create-update-vqggs"] Mar 07 04:37:48 crc kubenswrapper[4689]: W0307 04:37:48.965865 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf61f191_6961_4d76_bf10_2a6fad17cab5.slice/crio-bfa2cc9c37faa55940bf1efb0894f07796163d91896ced3df589c20285772ad1 WatchSource:0}: Error finding container bfa2cc9c37faa55940bf1efb0894f07796163d91896ced3df589c20285772ad1: Status 404 returned error can't find the container with id bfa2cc9c37faa55940bf1efb0894f07796163d91896ced3df589c20285772ad1 Mar 07 04:37:49 crc kubenswrapper[4689]: I0307 04:37:49.015062 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-lrppg"] Mar 07 04:37:49 crc kubenswrapper[4689]: W0307 04:37:49.020853 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod377abb24_c403_4bc4_96c6_904786cddd96.slice/crio-435572215bf9cb91b188276c9196ba6ed70cefa602ce159817d96019be6048aa WatchSource:0}: Error finding container 435572215bf9cb91b188276c9196ba6ed70cefa602ce159817d96019be6048aa: Status 404 returned error can't find the container with id 435572215bf9cb91b188276c9196ba6ed70cefa602ce159817d96019be6048aa Mar 07 04:37:49 crc kubenswrapper[4689]: I0307 04:37:49.756804 4689 generic.go:334] "Generic (PLEG): container finished" podID="bf61f191-6961-4d76-bf10-2a6fad17cab5" 
containerID="f698e75ad978cfba6d54756f60d78533f9e190c43a21361e46c8f523e46705bd" exitCode=0 Mar 07 04:37:49 crc kubenswrapper[4689]: I0307 04:37:49.756870 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" event={"ID":"bf61f191-6961-4d76-bf10-2a6fad17cab5","Type":"ContainerDied","Data":"f698e75ad978cfba6d54756f60d78533f9e190c43a21361e46c8f523e46705bd"} Mar 07 04:37:49 crc kubenswrapper[4689]: I0307 04:37:49.757210 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" event={"ID":"bf61f191-6961-4d76-bf10-2a6fad17cab5","Type":"ContainerStarted","Data":"bfa2cc9c37faa55940bf1efb0894f07796163d91896ced3df589c20285772ad1"} Mar 07 04:37:49 crc kubenswrapper[4689]: I0307 04:37:49.760098 4689 generic.go:334] "Generic (PLEG): container finished" podID="377abb24-c403-4bc4-96c6-904786cddd96" containerID="f4786452fdcaa7ad2115ba9bb4423fe6f9cdce0e73e522aa6b1b0cd9da170b25" exitCode=0 Mar 07 04:37:49 crc kubenswrapper[4689]: I0307 04:37:49.760153 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-lrppg" event={"ID":"377abb24-c403-4bc4-96c6-904786cddd96","Type":"ContainerDied","Data":"f4786452fdcaa7ad2115ba9bb4423fe6f9cdce0e73e522aa6b1b0cd9da170b25"} Mar 07 04:37:49 crc kubenswrapper[4689]: I0307 04:37:49.760211 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-lrppg" event={"ID":"377abb24-c403-4bc4-96c6-904786cddd96","Type":"ContainerStarted","Data":"435572215bf9cb91b188276c9196ba6ed70cefa602ce159817d96019be6048aa"} Mar 07 04:37:49 crc kubenswrapper[4689]: I0307 04:37:49.811712 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=2.116560009 podStartE2EDuration="9.811683554s" podCreationTimestamp="2026-03-07 04:37:40 +0000 UTC" firstStartedPulling="2026-03-07 04:37:40.862777275 +0000 
UTC m=+1105.909160764" lastFinishedPulling="2026-03-07 04:37:48.5579008 +0000 UTC m=+1113.604284309" observedRunningTime="2026-03-07 04:37:49.806114494 +0000 UTC m=+1114.852498023" watchObservedRunningTime="2026-03-07 04:37:49.811683554 +0000 UTC m=+1114.858067073" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.284200 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.292121 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-lrppg" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.392525 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf61f191-6961-4d76-bf10-2a6fad17cab5-operator-scripts\") pod \"bf61f191-6961-4d76-bf10-2a6fad17cab5\" (UID: \"bf61f191-6961-4d76-bf10-2a6fad17cab5\") " Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.392894 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk9cd\" (UniqueName: \"kubernetes.io/projected/bf61f191-6961-4d76-bf10-2a6fad17cab5-kube-api-access-sk9cd\") pod \"bf61f191-6961-4d76-bf10-2a6fad17cab5\" (UID: \"bf61f191-6961-4d76-bf10-2a6fad17cab5\") " Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.393393 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf61f191-6961-4d76-bf10-2a6fad17cab5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf61f191-6961-4d76-bf10-2a6fad17cab5" (UID: "bf61f191-6961-4d76-bf10-2a6fad17cab5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.403163 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf61f191-6961-4d76-bf10-2a6fad17cab5-kube-api-access-sk9cd" (OuterVolumeSpecName: "kube-api-access-sk9cd") pod "bf61f191-6961-4d76-bf10-2a6fad17cab5" (UID: "bf61f191-6961-4d76-bf10-2a6fad17cab5"). InnerVolumeSpecName "kube-api-access-sk9cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.494193 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfrjn\" (UniqueName: \"kubernetes.io/projected/377abb24-c403-4bc4-96c6-904786cddd96-kube-api-access-gfrjn\") pod \"377abb24-c403-4bc4-96c6-904786cddd96\" (UID: \"377abb24-c403-4bc4-96c6-904786cddd96\") " Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.494372 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/377abb24-c403-4bc4-96c6-904786cddd96-operator-scripts\") pod \"377abb24-c403-4bc4-96c6-904786cddd96\" (UID: \"377abb24-c403-4bc4-96c6-904786cddd96\") " Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.494855 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk9cd\" (UniqueName: \"kubernetes.io/projected/bf61f191-6961-4d76-bf10-2a6fad17cab5-kube-api-access-sk9cd\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.494891 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf61f191-6961-4d76-bf10-2a6fad17cab5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.495384 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/377abb24-c403-4bc4-96c6-904786cddd96-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "377abb24-c403-4bc4-96c6-904786cddd96" (UID: "377abb24-c403-4bc4-96c6-904786cddd96"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.503446 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377abb24-c403-4bc4-96c6-904786cddd96-kube-api-access-gfrjn" (OuterVolumeSpecName: "kube-api-access-gfrjn") pod "377abb24-c403-4bc4-96c6-904786cddd96" (UID: "377abb24-c403-4bc4-96c6-904786cddd96"). InnerVolumeSpecName "kube-api-access-gfrjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.596139 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfrjn\" (UniqueName: \"kubernetes.io/projected/377abb24-c403-4bc4-96c6-904786cddd96-kube-api-access-gfrjn\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.596908 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/377abb24-c403-4bc4-96c6-904786cddd96-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.780191 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-lrppg" event={"ID":"377abb24-c403-4bc4-96c6-904786cddd96","Type":"ContainerDied","Data":"435572215bf9cb91b188276c9196ba6ed70cefa602ce159817d96019be6048aa"} Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.780230 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="435572215bf9cb91b188276c9196ba6ed70cefa602ce159817d96019be6048aa" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.780294 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-lrppg" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.788118 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" event={"ID":"bf61f191-6961-4d76-bf10-2a6fad17cab5","Type":"ContainerDied","Data":"bfa2cc9c37faa55940bf1efb0894f07796163d91896ced3df589c20285772ad1"} Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.788196 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfa2cc9c37faa55940bf1efb0894f07796163d91896ced3df589c20285772ad1" Mar 07 04:37:51 crc kubenswrapper[4689]: I0307 04:37:51.788208 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-1932-account-create-update-vqggs" Mar 07 04:37:54 crc kubenswrapper[4689]: I0307 04:37:54.991439 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-6vnq9"] Mar 07 04:37:54 crc kubenswrapper[4689]: E0307 04:37:54.992289 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf61f191-6961-4d76-bf10-2a6fad17cab5" containerName="mariadb-account-create-update" Mar 07 04:37:54 crc kubenswrapper[4689]: I0307 04:37:54.992313 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf61f191-6961-4d76-bf10-2a6fad17cab5" containerName="mariadb-account-create-update" Mar 07 04:37:54 crc kubenswrapper[4689]: E0307 04:37:54.992333 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377abb24-c403-4bc4-96c6-904786cddd96" containerName="mariadb-database-create" Mar 07 04:37:54 crc kubenswrapper[4689]: I0307 04:37:54.992342 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="377abb24-c403-4bc4-96c6-904786cddd96" containerName="mariadb-database-create" Mar 07 04:37:54 crc kubenswrapper[4689]: I0307 04:37:54.992564 4689 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="377abb24-c403-4bc4-96c6-904786cddd96" containerName="mariadb-database-create" Mar 07 04:37:54 crc kubenswrapper[4689]: I0307 04:37:54.992588 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf61f191-6961-4d76-bf10-2a6fad17cab5" containerName="mariadb-account-create-update" Mar 07 04:37:54 crc kubenswrapper[4689]: I0307 04:37:54.993326 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:37:54 crc kubenswrapper[4689]: I0307 04:37:54.996659 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-hkgvs" Mar 07 04:37:54 crc kubenswrapper[4689]: I0307 04:37:54.997606 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Mar 07 04:37:54 crc kubenswrapper[4689]: I0307 04:37:54.999543 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-6vnq9"] Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.054268 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpkp5\" (UniqueName: \"kubernetes.io/projected/6f560ac3-026a-44d2-8b41-c0ded2d01b49-kube-api-access-cpkp5\") pod \"glance-db-sync-6vnq9\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.054311 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-db-sync-config-data\") pod \"glance-db-sync-6vnq9\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.054384 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-config-data\") pod \"glance-db-sync-6vnq9\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.155599 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-config-data\") pod \"glance-db-sync-6vnq9\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.155722 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpkp5\" (UniqueName: \"kubernetes.io/projected/6f560ac3-026a-44d2-8b41-c0ded2d01b49-kube-api-access-cpkp5\") pod \"glance-db-sync-6vnq9\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.155760 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-db-sync-config-data\") pod \"glance-db-sync-6vnq9\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.160818 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-config-data\") pod \"glance-db-sync-6vnq9\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.162127 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-db-sync-config-data\") pod \"glance-db-sync-6vnq9\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.197438 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpkp5\" (UniqueName: \"kubernetes.io/projected/6f560ac3-026a-44d2-8b41-c0ded2d01b49-kube-api-access-cpkp5\") pod \"glance-db-sync-6vnq9\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.313768 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.781893 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-6vnq9"] Mar 07 04:37:55 crc kubenswrapper[4689]: W0307 04:37:55.792342 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f560ac3_026a_44d2_8b41_c0ded2d01b49.slice/crio-1df670b955afd95aae226d995166b013c68838182ea2dc1c955bb7237362ba7a WatchSource:0}: Error finding container 1df670b955afd95aae226d995166b013c68838182ea2dc1c955bb7237362ba7a: Status 404 returned error can't find the container with id 1df670b955afd95aae226d995166b013c68838182ea2dc1c955bb7237362ba7a Mar 07 04:37:55 crc kubenswrapper[4689]: I0307 04:37:55.820134 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-6vnq9" event={"ID":"6f560ac3-026a-44d2-8b41-c0ded2d01b49","Type":"ContainerStarted","Data":"1df670b955afd95aae226d995166b013c68838182ea2dc1c955bb7237362ba7a"} Mar 07 04:38:00 crc kubenswrapper[4689]: I0307 04:38:00.133794 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547638-ztptc"] Mar 07 04:38:00 crc 
kubenswrapper[4689]: I0307 04:38:00.137645 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547638-ztptc" Mar 07 04:38:00 crc kubenswrapper[4689]: I0307 04:38:00.140400 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:38:00 crc kubenswrapper[4689]: I0307 04:38:00.141711 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:38:00 crc kubenswrapper[4689]: I0307 04:38:00.141751 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:38:00 crc kubenswrapper[4689]: I0307 04:38:00.151062 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547638-ztptc"] Mar 07 04:38:00 crc kubenswrapper[4689]: I0307 04:38:00.279216 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8txqb\" (UniqueName: \"kubernetes.io/projected/d5dab47b-a90b-4452-9afa-73e38014b1a5-kube-api-access-8txqb\") pod \"auto-csr-approver-29547638-ztptc\" (UID: \"d5dab47b-a90b-4452-9afa-73e38014b1a5\") " pod="openshift-infra/auto-csr-approver-29547638-ztptc" Mar 07 04:38:00 crc kubenswrapper[4689]: I0307 04:38:00.380675 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8txqb\" (UniqueName: \"kubernetes.io/projected/d5dab47b-a90b-4452-9afa-73e38014b1a5-kube-api-access-8txqb\") pod \"auto-csr-approver-29547638-ztptc\" (UID: \"d5dab47b-a90b-4452-9afa-73e38014b1a5\") " pod="openshift-infra/auto-csr-approver-29547638-ztptc" Mar 07 04:38:00 crc kubenswrapper[4689]: I0307 04:38:00.404432 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8txqb\" (UniqueName: \"kubernetes.io/projected/d5dab47b-a90b-4452-9afa-73e38014b1a5-kube-api-access-8txqb\") pod 
\"auto-csr-approver-29547638-ztptc\" (UID: \"d5dab47b-a90b-4452-9afa-73e38014b1a5\") " pod="openshift-infra/auto-csr-approver-29547638-ztptc" Mar 07 04:38:00 crc kubenswrapper[4689]: I0307 04:38:00.458215 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547638-ztptc" Mar 07 04:38:00 crc kubenswrapper[4689]: I0307 04:38:00.971044 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547638-ztptc"] Mar 07 04:38:01 crc kubenswrapper[4689]: I0307 04:38:01.869840 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547638-ztptc" event={"ID":"d5dab47b-a90b-4452-9afa-73e38014b1a5","Type":"ContainerStarted","Data":"a6ef0a9c51eba2f9172092349d1ccd80851c5dd7ac1cd24b536bf264dce7a797"} Mar 07 04:38:08 crc kubenswrapper[4689]: I0307 04:38:08.928564 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547638-ztptc" event={"ID":"d5dab47b-a90b-4452-9afa-73e38014b1a5","Type":"ContainerStarted","Data":"0ca506acd66aa2400e68ab25a7f08f12b730eb5e8e327076583942821ce53fb6"} Mar 07 04:38:08 crc kubenswrapper[4689]: I0307 04:38:08.949964 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547638-ztptc" podStartSLOduration=1.395919117 podStartE2EDuration="8.94994324s" podCreationTimestamp="2026-03-07 04:38:00 +0000 UTC" firstStartedPulling="2026-03-07 04:38:00.977902192 +0000 UTC m=+1126.024285691" lastFinishedPulling="2026-03-07 04:38:08.531926325 +0000 UTC m=+1133.578309814" observedRunningTime="2026-03-07 04:38:08.944548475 +0000 UTC m=+1133.990931984" watchObservedRunningTime="2026-03-07 04:38:08.94994324 +0000 UTC m=+1133.996326729" Mar 07 04:38:09 crc kubenswrapper[4689]: I0307 04:38:09.944418 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547638-ztptc" 
event={"ID":"d5dab47b-a90b-4452-9afa-73e38014b1a5","Type":"ContainerDied","Data":"0ca506acd66aa2400e68ab25a7f08f12b730eb5e8e327076583942821ce53fb6"} Mar 07 04:38:09 crc kubenswrapper[4689]: I0307 04:38:09.944534 4689 generic.go:334] "Generic (PLEG): container finished" podID="d5dab47b-a90b-4452-9afa-73e38014b1a5" containerID="0ca506acd66aa2400e68ab25a7f08f12b730eb5e8e327076583942821ce53fb6" exitCode=0 Mar 07 04:38:09 crc kubenswrapper[4689]: I0307 04:38:09.949679 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-6vnq9" event={"ID":"6f560ac3-026a-44d2-8b41-c0ded2d01b49","Type":"ContainerStarted","Data":"8f93665d9f72c0f3b18b3ed33085c1482efa697b8936866087d3e21cda14670b"} Mar 07 04:38:09 crc kubenswrapper[4689]: I0307 04:38:09.997396 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-6vnq9" podStartSLOduration=3.20570559 podStartE2EDuration="15.997363418s" podCreationTimestamp="2026-03-07 04:37:54 +0000 UTC" firstStartedPulling="2026-03-07 04:37:55.794204805 +0000 UTC m=+1120.840588294" lastFinishedPulling="2026-03-07 04:38:08.585862613 +0000 UTC m=+1133.632246122" observedRunningTime="2026-03-07 04:38:09.994348918 +0000 UTC m=+1135.040732447" watchObservedRunningTime="2026-03-07 04:38:09.997363418 +0000 UTC m=+1135.043746947" Mar 07 04:38:11 crc kubenswrapper[4689]: I0307 04:38:11.246066 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547638-ztptc" Mar 07 04:38:11 crc kubenswrapper[4689]: I0307 04:38:11.345368 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8txqb\" (UniqueName: \"kubernetes.io/projected/d5dab47b-a90b-4452-9afa-73e38014b1a5-kube-api-access-8txqb\") pod \"d5dab47b-a90b-4452-9afa-73e38014b1a5\" (UID: \"d5dab47b-a90b-4452-9afa-73e38014b1a5\") " Mar 07 04:38:11 crc kubenswrapper[4689]: I0307 04:38:11.350742 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5dab47b-a90b-4452-9afa-73e38014b1a5-kube-api-access-8txqb" (OuterVolumeSpecName: "kube-api-access-8txqb") pod "d5dab47b-a90b-4452-9afa-73e38014b1a5" (UID: "d5dab47b-a90b-4452-9afa-73e38014b1a5"). InnerVolumeSpecName "kube-api-access-8txqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:38:11 crc kubenswrapper[4689]: I0307 04:38:11.448231 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8txqb\" (UniqueName: \"kubernetes.io/projected/d5dab47b-a90b-4452-9afa-73e38014b1a5-kube-api-access-8txqb\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:11 crc kubenswrapper[4689]: I0307 04:38:11.974722 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547638-ztptc" event={"ID":"d5dab47b-a90b-4452-9afa-73e38014b1a5","Type":"ContainerDied","Data":"a6ef0a9c51eba2f9172092349d1ccd80851c5dd7ac1cd24b536bf264dce7a797"} Mar 07 04:38:11 crc kubenswrapper[4689]: I0307 04:38:11.975050 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6ef0a9c51eba2f9172092349d1ccd80851c5dd7ac1cd24b536bf264dce7a797" Mar 07 04:38:11 crc kubenswrapper[4689]: I0307 04:38:11.974777 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547638-ztptc" Mar 07 04:38:12 crc kubenswrapper[4689]: I0307 04:38:12.012664 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547632-8mxh5"] Mar 07 04:38:12 crc kubenswrapper[4689]: I0307 04:38:12.017996 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547632-8mxh5"] Mar 07 04:38:13 crc kubenswrapper[4689]: I0307 04:38:13.837815 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3fed712-f790-4590-850a-eea2fe0e36a9" path="/var/lib/kubelet/pods/d3fed712-f790-4590-850a-eea2fe0e36a9/volumes" Mar 07 04:38:17 crc kubenswrapper[4689]: I0307 04:38:17.022598 4689 generic.go:334] "Generic (PLEG): container finished" podID="6f560ac3-026a-44d2-8b41-c0ded2d01b49" containerID="8f93665d9f72c0f3b18b3ed33085c1482efa697b8936866087d3e21cda14670b" exitCode=0 Mar 07 04:38:17 crc kubenswrapper[4689]: I0307 04:38:17.022712 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-6vnq9" event={"ID":"6f560ac3-026a-44d2-8b41-c0ded2d01b49","Type":"ContainerDied","Data":"8f93665d9f72c0f3b18b3ed33085c1482efa697b8936866087d3e21cda14670b"} Mar 07 04:38:18 crc kubenswrapper[4689]: I0307 04:38:18.352195 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:38:18 crc kubenswrapper[4689]: I0307 04:38:18.390876 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-db-sync-config-data\") pod \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " Mar 07 04:38:18 crc kubenswrapper[4689]: I0307 04:38:18.391060 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpkp5\" (UniqueName: \"kubernetes.io/projected/6f560ac3-026a-44d2-8b41-c0ded2d01b49-kube-api-access-cpkp5\") pod \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " Mar 07 04:38:18 crc kubenswrapper[4689]: I0307 04:38:18.391158 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-config-data\") pod \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\" (UID: \"6f560ac3-026a-44d2-8b41-c0ded2d01b49\") " Mar 07 04:38:18 crc kubenswrapper[4689]: I0307 04:38:18.397372 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f560ac3-026a-44d2-8b41-c0ded2d01b49-kube-api-access-cpkp5" (OuterVolumeSpecName: "kube-api-access-cpkp5") pod "6f560ac3-026a-44d2-8b41-c0ded2d01b49" (UID: "6f560ac3-026a-44d2-8b41-c0ded2d01b49"). InnerVolumeSpecName "kube-api-access-cpkp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:38:18 crc kubenswrapper[4689]: I0307 04:38:18.398316 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6f560ac3-026a-44d2-8b41-c0ded2d01b49" (UID: "6f560ac3-026a-44d2-8b41-c0ded2d01b49"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:38:18 crc kubenswrapper[4689]: I0307 04:38:18.435920 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-config-data" (OuterVolumeSpecName: "config-data") pod "6f560ac3-026a-44d2-8b41-c0ded2d01b49" (UID: "6f560ac3-026a-44d2-8b41-c0ded2d01b49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:38:18 crc kubenswrapper[4689]: I0307 04:38:18.493541 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:18 crc kubenswrapper[4689]: I0307 04:38:18.493597 4689 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f560ac3-026a-44d2-8b41-c0ded2d01b49-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:18 crc kubenswrapper[4689]: I0307 04:38:18.493621 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpkp5\" (UniqueName: \"kubernetes.io/projected/6f560ac3-026a-44d2-8b41-c0ded2d01b49-kube-api-access-cpkp5\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:19 crc kubenswrapper[4689]: I0307 04:38:19.045898 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-6vnq9" event={"ID":"6f560ac3-026a-44d2-8b41-c0ded2d01b49","Type":"ContainerDied","Data":"1df670b955afd95aae226d995166b013c68838182ea2dc1c955bb7237362ba7a"} Mar 07 04:38:19 crc kubenswrapper[4689]: I0307 04:38:19.045995 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1df670b955afd95aae226d995166b013c68838182ea2dc1c955bb7237362ba7a" Mar 07 04:38:19 crc kubenswrapper[4689]: I0307 04:38:19.045996 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-6vnq9" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.355828 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:38:20 crc kubenswrapper[4689]: E0307 04:38:20.357102 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5dab47b-a90b-4452-9afa-73e38014b1a5" containerName="oc" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.357199 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5dab47b-a90b-4452-9afa-73e38014b1a5" containerName="oc" Mar 07 04:38:20 crc kubenswrapper[4689]: E0307 04:38:20.357279 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f560ac3-026a-44d2-8b41-c0ded2d01b49" containerName="glance-db-sync" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.357353 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f560ac3-026a-44d2-8b41-c0ded2d01b49" containerName="glance-db-sync" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.357603 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5dab47b-a90b-4452-9afa-73e38014b1a5" containerName="oc" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.357686 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f560ac3-026a-44d2-8b41-c0ded2d01b49" containerName="glance-db-sync" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.358683 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.360607 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-hkgvs" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.360916 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.361066 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.372816 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.378459 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.379583 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.394065 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422344 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrlgb\" (UniqueName: \"kubernetes.io/projected/cb9869c2-c16b-48fd-ae88-9494cbe75728-kube-api-access-wrlgb\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422418 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-logs\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422456 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-httpd-run\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422475 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-scripts\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422521 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422544 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-dev\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422570 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-run\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422590 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422609 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422627 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-lib-modules\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422655 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-sys\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422802 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-nvme\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422867 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.422892 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-config-data\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.523924 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-scripts\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524272 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524307 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-dev\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524323 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-run\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524345 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-nvme\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524384 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524403 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-dev\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524449 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-dev\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524506 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-run\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524530 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-run\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524564 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " 
pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524607 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524633 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524654 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-lib-modules\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524675 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-config-data\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524696 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-sys\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524713 4689 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqzcf\" (UniqueName: \"kubernetes.io/projected/b5d45483-6b6f-41c8-9d51-8e93c85401db-kube-api-access-qqzcf\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524735 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-logs\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524752 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-nvme\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524754 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524770 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524824 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-config-data\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524875 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-sys\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524904 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-httpd-run\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524965 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.524988 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525011 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525020 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrlgb\" (UniqueName: \"kubernetes.io/projected/cb9869c2-c16b-48fd-ae88-9494cbe75728-kube-api-access-wrlgb\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525069 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-scripts\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525099 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525186 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-logs\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525014 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525264 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-lib-modules\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525315 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-sys\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525415 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-nvme\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525607 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-logs\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525637 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-lib-modules\") pod \"glance-default-single-1\" (UID: 
\"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525677 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-httpd-run\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.525961 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-httpd-run\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.529786 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-scripts\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.530200 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-config-data\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.545000 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.553724 4689 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.555511 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrlgb\" (UniqueName: \"kubernetes.io/projected/cb9869c2-c16b-48fd-ae88-9494cbe75728-kube-api-access-wrlgb\") pod \"glance-default-single-0\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627406 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-config-data\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627455 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqzcf\" (UniqueName: \"kubernetes.io/projected/b5d45483-6b6f-41c8-9d51-8e93c85401db-kube-api-access-qqzcf\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627477 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-logs\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627504 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-sys\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627519 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-httpd-run\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627546 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627561 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627581 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-scripts\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627596 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-var-locks-brick\") pod \"glance-default-single-1\" (UID: 
\"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627608 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-sys\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627676 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-lib-modules\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627631 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-lib-modules\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627733 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627747 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc 
kubenswrapper[4689]: I0307 04:38:20.627769 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-dev\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627798 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-run\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627835 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-nvme\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.627891 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.628007 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-logs\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.628068 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-httpd-run\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.628153 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.628201 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.628224 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-run\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.628240 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-dev\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.628263 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-nvme\") pod \"glance-default-single-1\" (UID: 
\"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.631850 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-scripts\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.640274 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-config-data\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.654894 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqzcf\" (UniqueName: \"kubernetes.io/projected/b5d45483-6b6f-41c8-9d51-8e93c85401db-kube-api-access-qqzcf\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.661741 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.662097 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.674916 
4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.699592 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:20 crc kubenswrapper[4689]: I0307 04:38:20.746080 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:38:26 crc kubenswrapper[4689]: I0307 04:38:26.127322 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:38:26 crc kubenswrapper[4689]: W0307 04:38:26.129620 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb9869c2_c16b_48fd_ae88_9494cbe75728.slice/crio-9074dfc4cfbb9c80467a4a2f6cb69901a398b145d118475fc5f9f4f219c8485b WatchSource:0}: Error finding container 9074dfc4cfbb9c80467a4a2f6cb69901a398b145d118475fc5f9f4f219c8485b: Status 404 returned error can't find the container with id 9074dfc4cfbb9c80467a4a2f6cb69901a398b145d118475fc5f9f4f219c8485b Mar 07 04:38:26 crc kubenswrapper[4689]: I0307 04:38:26.149281 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.119933 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cb9869c2-c16b-48fd-ae88-9494cbe75728","Type":"ContainerStarted","Data":"167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598"} Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.120688 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cb9869c2-c16b-48fd-ae88-9494cbe75728","Type":"ContainerStarted","Data":"b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442"} Mar 
07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.120714 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cb9869c2-c16b-48fd-ae88-9494cbe75728","Type":"ContainerStarted","Data":"9074dfc4cfbb9c80467a4a2f6cb69901a398b145d118475fc5f9f4f219c8485b"} Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.131669 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b5d45483-6b6f-41c8-9d51-8e93c85401db","Type":"ContainerStarted","Data":"c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f"} Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.131725 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b5d45483-6b6f-41c8-9d51-8e93c85401db","Type":"ContainerStarted","Data":"eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a"} Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.131749 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b5d45483-6b6f-41c8-9d51-8e93c85401db","Type":"ContainerStarted","Data":"1c1eca631f1f94ae536249793be6b88cae935d1db59d33865610ee54131426d4"} Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.132084 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="b5d45483-6b6f-41c8-9d51-8e93c85401db" containerName="glance-log" containerID="cri-o://eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a" gracePeriod=30 Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.132404 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="b5d45483-6b6f-41c8-9d51-8e93c85401db" containerName="glance-httpd" containerID="cri-o://c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f" gracePeriod=30 
Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.190742 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=8.190714619 podStartE2EDuration="8.190714619s" podCreationTimestamp="2026-03-07 04:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:38:27.158457784 +0000 UTC m=+1152.204841283" watchObservedRunningTime="2026-03-07 04:38:27.190714619 +0000 UTC m=+1152.237098148" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.193672 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=8.193656958 podStartE2EDuration="8.193656958s" podCreationTimestamp="2026-03-07 04:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:38:27.183418503 +0000 UTC m=+1152.229802022" watchObservedRunningTime="2026-03-07 04:38:27.193656958 +0000 UTC m=+1152.240040487" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.637340 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.763893 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-scripts\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.764030 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-iscsi\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.764137 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.764309 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-lib-modules\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.764446 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.764377 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.764540 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-config-data\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.764996 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-run\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765042 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-httpd-run\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765084 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqzcf\" (UniqueName: \"kubernetes.io/projected/b5d45483-6b6f-41c8-9d51-8e93c85401db-kube-api-access-qqzcf\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765105 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-run" (OuterVolumeSpecName: "run") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765124 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-dev\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765157 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-nvme\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765208 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765268 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-sys\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765296 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-logs\") pod \"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765331 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-var-locks-brick\") pod 
\"b5d45483-6b6f-41c8-9d51-8e93c85401db\" (UID: \"b5d45483-6b6f-41c8-9d51-8e93c85401db\") " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765658 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765778 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765815 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-sys" (OuterVolumeSpecName: "sys") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.765996 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.766037 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.766061 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.766079 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.766096 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.766114 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.766201 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-dev" (OuterVolumeSpecName: "dev") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.766210 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-logs" (OuterVolumeSpecName: "logs") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.766250 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.769115 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-scripts" (OuterVolumeSpecName: "scripts") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.769282 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5d45483-6b6f-41c8-9d51-8e93c85401db-kube-api-access-qqzcf" (OuterVolumeSpecName: "kube-api-access-qqzcf") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "kube-api-access-qqzcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.769289 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.770600 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.842038 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-config-data" (OuterVolumeSpecName: "config-data") pod "b5d45483-6b6f-41c8-9d51-8e93c85401db" (UID: "b5d45483-6b6f-41c8-9d51-8e93c85401db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.867490 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.867525 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.867535 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqzcf\" (UniqueName: \"kubernetes.io/projected/b5d45483-6b6f-41c8-9d51-8e93c85401db-kube-api-access-qqzcf\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.867546 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.867556 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b5d45483-6b6f-41c8-9d51-8e93c85401db-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.867571 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.867582 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d45483-6b6f-41c8-9d51-8e93c85401db-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.867590 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b5d45483-6b6f-41c8-9d51-8e93c85401db-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.886211 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.897825 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.969365 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:27 crc kubenswrapper[4689]: I0307 04:38:27.969396 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.163638 4689 generic.go:334] "Generic (PLEG): container finished" podID="b5d45483-6b6f-41c8-9d51-8e93c85401db" containerID="c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f" exitCode=143 Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.163887 4689 generic.go:334] "Generic (PLEG): container finished" podID="b5d45483-6b6f-41c8-9d51-8e93c85401db" containerID="eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a" exitCode=143 Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.163944 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.163818 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b5d45483-6b6f-41c8-9d51-8e93c85401db","Type":"ContainerDied","Data":"c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f"} Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.164041 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b5d45483-6b6f-41c8-9d51-8e93c85401db","Type":"ContainerDied","Data":"eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a"} Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.164058 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b5d45483-6b6f-41c8-9d51-8e93c85401db","Type":"ContainerDied","Data":"1c1eca631f1f94ae536249793be6b88cae935d1db59d33865610ee54131426d4"} Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.164083 4689 scope.go:117] "RemoveContainer" containerID="c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.200435 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.210694 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.211539 4689 scope.go:117] "RemoveContainer" containerID="eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.232415 4689 scope.go:117] "RemoveContainer" containerID="c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.232684 4689 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:38:28 crc kubenswrapper[4689]: E0307 04:38:28.232966 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d45483-6b6f-41c8-9d51-8e93c85401db" containerName="glance-log" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.232983 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d45483-6b6f-41c8-9d51-8e93c85401db" containerName="glance-log" Mar 07 04:38:28 crc kubenswrapper[4689]: E0307 04:38:28.233006 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d45483-6b6f-41c8-9d51-8e93c85401db" containerName="glance-httpd" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.233012 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d45483-6b6f-41c8-9d51-8e93c85401db" containerName="glance-httpd" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.233117 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d45483-6b6f-41c8-9d51-8e93c85401db" containerName="glance-log" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.233133 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d45483-6b6f-41c8-9d51-8e93c85401db" containerName="glance-httpd" Mar 07 04:38:28 crc kubenswrapper[4689]: E0307 04:38:28.233944 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f\": container with ID starting with c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f not found: ID does not exist" containerID="c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.233983 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f"} err="failed to get container status 
\"c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f\": rpc error: code = NotFound desc = could not find container \"c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f\": container with ID starting with c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f not found: ID does not exist" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.234011 4689 scope.go:117] "RemoveContainer" containerID="eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a" Mar 07 04:38:28 crc kubenswrapper[4689]: E0307 04:38:28.234292 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a\": container with ID starting with eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a not found: ID does not exist" containerID="eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.234316 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a"} err="failed to get container status \"eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a\": rpc error: code = NotFound desc = could not find container \"eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a\": container with ID starting with eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a not found: ID does not exist" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.234329 4689 scope.go:117] "RemoveContainer" containerID="c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.235264 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f"} err="failed to get 
container status \"c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f\": rpc error: code = NotFound desc = could not find container \"c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f\": container with ID starting with c57e552384e4801f7fd886ac3a17d4e869eb60dcb321e959e42e407d11613e2f not found: ID does not exist" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.235295 4689 scope.go:117] "RemoveContainer" containerID="eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.238740 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a"} err="failed to get container status \"eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a\": rpc error: code = NotFound desc = could not find container \"eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a\": container with ID starting with eeed9f73476bfec7d7c850243a7b206ae4c1ac1d6288aa855f8eee55a8461f8a not found: ID does not exist" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.244939 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.246964 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375556 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-dev\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375604 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375628 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375660 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375674 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-scripts\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375694 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8psw4\" (UniqueName: \"kubernetes.io/projected/5ac85295-97e0-4b1d-a8dd-540613931917-kube-api-access-8psw4\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375714 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-run\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375734 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-nvme\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375752 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-logs\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375777 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-sys\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375796 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-httpd-run\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375819 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-lib-modules\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375835 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.375857 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-config-data\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.476773 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-httpd-run\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.476842 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-lib-modules\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.476879 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.476918 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-config-data\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.476962 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-lib-modules\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.476972 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-dev\") pod \"glance-default-single-1\" (UID: 
\"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477024 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-dev\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477091 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477135 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477216 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477249 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-scripts\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: 
I0307 04:38:28.477286 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-httpd-run\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477290 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8psw4\" (UniqueName: \"kubernetes.io/projected/5ac85295-97e0-4b1d-a8dd-540613931917-kube-api-access-8psw4\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477359 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-run\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477399 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-nvme\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477424 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-logs\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477443 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477543 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477876 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477902 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-run\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.477960 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-nvme\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.478091 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.478303 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-sys\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.478400 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-logs\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.478593 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-sys\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.486119 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-config-data\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.492494 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-scripts\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " 
pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.500966 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8psw4\" (UniqueName: \"kubernetes.io/projected/5ac85295-97e0-4b1d-a8dd-540613931917-kube-api-access-8psw4\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.503966 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.521178 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:28 crc kubenswrapper[4689]: I0307 04:38:28.572794 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:29 crc kubenswrapper[4689]: I0307 04:38:29.056545 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:38:29 crc kubenswrapper[4689]: W0307 04:38:29.062109 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ac85295_97e0_4b1d_a8dd_540613931917.slice/crio-a889654f54fae54b4cdd08c2d881718c970baa86c892fdf1519d9b454ea661e6 WatchSource:0}: Error finding container a889654f54fae54b4cdd08c2d881718c970baa86c892fdf1519d9b454ea661e6: Status 404 returned error can't find the container with id a889654f54fae54b4cdd08c2d881718c970baa86c892fdf1519d9b454ea661e6 Mar 07 04:38:29 crc kubenswrapper[4689]: I0307 04:38:29.176012 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5ac85295-97e0-4b1d-a8dd-540613931917","Type":"ContainerStarted","Data":"a889654f54fae54b4cdd08c2d881718c970baa86c892fdf1519d9b454ea661e6"} Mar 07 04:38:29 crc kubenswrapper[4689]: I0307 04:38:29.189343 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:38:29 crc kubenswrapper[4689]: I0307 04:38:29.189414 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:38:29 crc kubenswrapper[4689]: I0307 04:38:29.836304 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b5d45483-6b6f-41c8-9d51-8e93c85401db" path="/var/lib/kubelet/pods/b5d45483-6b6f-41c8-9d51-8e93c85401db/volumes" Mar 07 04:38:30 crc kubenswrapper[4689]: I0307 04:38:30.188750 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5ac85295-97e0-4b1d-a8dd-540613931917","Type":"ContainerStarted","Data":"97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7"} Mar 07 04:38:30 crc kubenswrapper[4689]: I0307 04:38:30.189087 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5ac85295-97e0-4b1d-a8dd-540613931917","Type":"ContainerStarted","Data":"cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3"} Mar 07 04:38:30 crc kubenswrapper[4689]: I0307 04:38:30.221128 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.221105244 podStartE2EDuration="2.221105244s" podCreationTimestamp="2026-03-07 04:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:38:30.217770285 +0000 UTC m=+1155.264153784" watchObservedRunningTime="2026-03-07 04:38:30.221105244 +0000 UTC m=+1155.267488743" Mar 07 04:38:30 crc kubenswrapper[4689]: I0307 04:38:30.675911 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:30 crc kubenswrapper[4689]: I0307 04:38:30.675987 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:30 crc kubenswrapper[4689]: I0307 04:38:30.718793 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:30 crc kubenswrapper[4689]: I0307 04:38:30.740296 4689 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:31 crc kubenswrapper[4689]: I0307 04:38:31.194785 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:31 crc kubenswrapper[4689]: I0307 04:38:31.194823 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:33 crc kubenswrapper[4689]: I0307 04:38:33.014496 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:33 crc kubenswrapper[4689]: I0307 04:38:33.015499 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:38 crc kubenswrapper[4689]: I0307 04:38:38.583633 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:38 crc kubenswrapper[4689]: I0307 04:38:38.584568 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:38 crc kubenswrapper[4689]: I0307 04:38:38.624970 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:38 crc kubenswrapper[4689]: I0307 04:38:38.651013 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:39 crc kubenswrapper[4689]: I0307 04:38:39.279592 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:39 crc kubenswrapper[4689]: I0307 04:38:39.279670 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:41 crc kubenswrapper[4689]: I0307 04:38:41.112612 4689 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:41 crc kubenswrapper[4689]: I0307 04:38:41.171500 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:38:41 crc kubenswrapper[4689]: I0307 04:38:41.231386 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:38:41 crc kubenswrapper[4689]: I0307 04:38:41.231695 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="cb9869c2-c16b-48fd-ae88-9494cbe75728" containerName="glance-log" containerID="cri-o://b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442" gracePeriod=30 Mar 07 04:38:41 crc kubenswrapper[4689]: I0307 04:38:41.231843 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="cb9869c2-c16b-48fd-ae88-9494cbe75728" containerName="glance-httpd" containerID="cri-o://167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598" gracePeriod=30 Mar 07 04:38:42 crc kubenswrapper[4689]: I0307 04:38:42.305723 4689 generic.go:334] "Generic (PLEG): container finished" podID="cb9869c2-c16b-48fd-ae88-9494cbe75728" containerID="b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442" exitCode=143 Mar 07 04:38:42 crc kubenswrapper[4689]: I0307 04:38:42.306783 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cb9869c2-c16b-48fd-ae88-9494cbe75728","Type":"ContainerDied","Data":"b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442"} Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.783046 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922063 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-iscsi\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922124 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-logs\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922144 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-dev\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922154 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922215 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-sys\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922239 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-dev" (OuterVolumeSpecName: "dev") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922248 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922268 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-sys" (OuterVolumeSpecName: "sys") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922289 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-config-data\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922329 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922372 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-nvme\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922404 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-lib-modules\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922455 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-run\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922488 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-httpd-run\") pod 
\"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922479 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922506 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922553 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-scripts\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922579 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrlgb\" (UniqueName: \"kubernetes.io/projected/cb9869c2-c16b-48fd-ae88-9494cbe75728-kube-api-access-wrlgb\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: \"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922602 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-var-locks-brick\") pod \"cb9869c2-c16b-48fd-ae88-9494cbe75728\" (UID: 
\"cb9869c2-c16b-48fd-ae88-9494cbe75728\") " Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922586 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-run" (OuterVolumeSpecName: "run") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922682 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-logs" (OuterVolumeSpecName: "logs") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922804 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.922846 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.923188 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.923209 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.923221 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.923233 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.923246 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.923257 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb9869c2-c16b-48fd-ae88-9494cbe75728-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.923267 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.923277 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.923288 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cb9869c2-c16b-48fd-ae88-9494cbe75728-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.928129 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.928212 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9869c2-c16b-48fd-ae88-9494cbe75728-kube-api-access-wrlgb" (OuterVolumeSpecName: "kube-api-access-wrlgb") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "kube-api-access-wrlgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.928228 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.928910 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-scripts" (OuterVolumeSpecName: "scripts") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:38:44 crc kubenswrapper[4689]: I0307 04:38:44.963623 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-config-data" (OuterVolumeSpecName: "config-data") pod "cb9869c2-c16b-48fd-ae88-9494cbe75728" (UID: "cb9869c2-c16b-48fd-ae88-9494cbe75728"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.025240 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.025518 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.025625 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrlgb\" (UniqueName: \"kubernetes.io/projected/cb9869c2-c16b-48fd-ae88-9494cbe75728-kube-api-access-wrlgb\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.025749 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.025844 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9869c2-c16b-48fd-ae88-9494cbe75728-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.041928 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.042339 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.127155 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.127202 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.341044 4689 generic.go:334] "Generic (PLEG): container finished" podID="cb9869c2-c16b-48fd-ae88-9494cbe75728" containerID="167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598" exitCode=0 Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.341157 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.341527 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cb9869c2-c16b-48fd-ae88-9494cbe75728","Type":"ContainerDied","Data":"167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598"} Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.341655 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cb9869c2-c16b-48fd-ae88-9494cbe75728","Type":"ContainerDied","Data":"9074dfc4cfbb9c80467a4a2f6cb69901a398b145d118475fc5f9f4f219c8485b"} Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.341774 4689 scope.go:117] "RemoveContainer" containerID="167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.369745 4689 scope.go:117] "RemoveContainer" containerID="b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.373402 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.382611 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.393103 4689 scope.go:117] "RemoveContainer" containerID="167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598" Mar 07 04:38:45 crc kubenswrapper[4689]: E0307 04:38:45.395325 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598\": container with ID starting with 167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598 not found: ID does not exist" 
containerID="167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.395366 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598"} err="failed to get container status \"167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598\": rpc error: code = NotFound desc = could not find container \"167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598\": container with ID starting with 167a59b92dbdf868d305947871a086045cbdfd5cc73cf09f1ae9ba019e165598 not found: ID does not exist" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.395401 4689 scope.go:117] "RemoveContainer" containerID="b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442" Mar 07 04:38:45 crc kubenswrapper[4689]: E0307 04:38:45.395809 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442\": container with ID starting with b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442 not found: ID does not exist" containerID="b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.395840 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442"} err="failed to get container status \"b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442\": rpc error: code = NotFound desc = could not find container \"b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442\": container with ID starting with b7f678d4c02bde83b905859d0bd54e57d6e303d3c05a48d3e2a920d416806442 not found: ID does not exist" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.409056 4689 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:38:45 crc kubenswrapper[4689]: E0307 04:38:45.409390 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9869c2-c16b-48fd-ae88-9494cbe75728" containerName="glance-log" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.409412 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9869c2-c16b-48fd-ae88-9494cbe75728" containerName="glance-log" Mar 07 04:38:45 crc kubenswrapper[4689]: E0307 04:38:45.409440 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9869c2-c16b-48fd-ae88-9494cbe75728" containerName="glance-httpd" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.409449 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9869c2-c16b-48fd-ae88-9494cbe75728" containerName="glance-httpd" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.409604 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9869c2-c16b-48fd-ae88-9494cbe75728" containerName="glance-log" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.409642 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9869c2-c16b-48fd-ae88-9494cbe75728" containerName="glance-httpd" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.410516 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.422936 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533408 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-scripts\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533475 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533516 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-lib-modules\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533535 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-config-data\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533550 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-sys\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533566 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp4fk\" (UniqueName: \"kubernetes.io/projected/8a9abad5-4514-4878-b74a-5da9b308c5d6-kube-api-access-zp4fk\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533584 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533598 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533615 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-logs\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533638 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-dev\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533658 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-run\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533679 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-httpd-run\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533702 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.533750 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-nvme\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.634623 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-nvme\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.634940 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-scripts\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.635051 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.634793 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-nvme\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.635260 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-lib-modules\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.635390 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-config-data\") pod \"glance-default-single-0\" (UID: 
\"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.635478 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-sys\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.635626 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp4fk\" (UniqueName: \"kubernetes.io/projected/8a9abad5-4514-4878-b74a-5da9b308c5d6-kube-api-access-zp4fk\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.635383 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-lib-modules\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.635579 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-sys\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.635265 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc 
kubenswrapper[4689]: I0307 04:38:45.636371 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.636465 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.636542 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.636719 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-logs\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.636684 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.637407 4689 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-dev\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.637159 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-logs\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.638585 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-dev\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.638818 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-run\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.639727 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-scripts\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.638710 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-run\") pod \"glance-default-single-0\" (UID: 
\"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.650305 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.650434 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-httpd-run\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.650578 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.650723 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-httpd-run\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.656317 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-config-data\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc 
kubenswrapper[4689]: I0307 04:38:45.659579 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.661636 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp4fk\" (UniqueName: \"kubernetes.io/projected/8a9abad5-4514-4878-b74a-5da9b308c5d6-kube-api-access-zp4fk\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.672997 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-0\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.723546 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:45 crc kubenswrapper[4689]: I0307 04:38:45.836072 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb9869c2-c16b-48fd-ae88-9494cbe75728" path="/var/lib/kubelet/pods/cb9869c2-c16b-48fd-ae88-9494cbe75728/volumes" Mar 07 04:38:46 crc kubenswrapper[4689]: I0307 04:38:46.030511 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:38:46 crc kubenswrapper[4689]: I0307 04:38:46.349804 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8a9abad5-4514-4878-b74a-5da9b308c5d6","Type":"ContainerStarted","Data":"ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e"} Mar 07 04:38:46 crc kubenswrapper[4689]: I0307 04:38:46.350041 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8a9abad5-4514-4878-b74a-5da9b308c5d6","Type":"ContainerStarted","Data":"14cf2d4a79cb25a76206a7e2a91b0098fa73b0f3c67ee3cc9c425810728cb004"} Mar 07 04:38:47 crc kubenswrapper[4689]: I0307 04:38:47.360533 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8a9abad5-4514-4878-b74a-5da9b308c5d6","Type":"ContainerStarted","Data":"af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c"} Mar 07 04:38:47 crc kubenswrapper[4689]: I0307 04:38:47.397840 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.397810178 podStartE2EDuration="2.397810178s" podCreationTimestamp="2026-03-07 04:38:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:38:47.386489064 +0000 UTC m=+1172.432872583" watchObservedRunningTime="2026-03-07 04:38:47.397810178 +0000 
UTC m=+1172.444193687" Mar 07 04:38:50 crc kubenswrapper[4689]: I0307 04:38:50.974349 4689 scope.go:117] "RemoveContainer" containerID="31d7a0089316d9e5ab3ae8149601cd5a0cb7f9defe50dda457b7078e8f866103" Mar 07 04:38:55 crc kubenswrapper[4689]: I0307 04:38:55.724619 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:55 crc kubenswrapper[4689]: I0307 04:38:55.725375 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:55 crc kubenswrapper[4689]: I0307 04:38:55.765341 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:55 crc kubenswrapper[4689]: I0307 04:38:55.807157 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:56 crc kubenswrapper[4689]: I0307 04:38:56.453778 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:56 crc kubenswrapper[4689]: I0307 04:38:56.453821 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:58 crc kubenswrapper[4689]: I0307 04:38:58.314238 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:58 crc kubenswrapper[4689]: I0307 04:38:58.394361 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:38:59 crc kubenswrapper[4689]: I0307 04:38:59.190605 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 07 04:38:59 crc kubenswrapper[4689]: I0307 04:38:59.190994 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.561422 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-6vnq9"] Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.572350 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-6vnq9"] Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.626851 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance1932-account-delete-z6cn5"] Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.628572 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.652145 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance1932-account-delete-z6cn5"] Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.668764 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.668975 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="8a9abad5-4514-4878-b74a-5da9b308c5d6" containerName="glance-log" containerID="cri-o://ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e" gracePeriod=30 Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.669098 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="8a9abad5-4514-4878-b74a-5da9b308c5d6" containerName="glance-httpd" containerID="cri-o://af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c" gracePeriod=30 Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.680681 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.680950 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="5ac85295-97e0-4b1d-a8dd-540613931917" containerName="glance-log" containerID="cri-o://97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7" gracePeriod=30 Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.681363 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="5ac85295-97e0-4b1d-a8dd-540613931917" containerName="glance-httpd" 
containerID="cri-o://cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3" gracePeriod=30 Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.708000 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqbfn\" (UniqueName: \"kubernetes.io/projected/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-kube-api-access-gqbfn\") pod \"glance1932-account-delete-z6cn5\" (UID: \"8bf3baad-f045-4ab6-a1e2-c96a22526cdc\") " pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.708367 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-operator-scripts\") pod \"glance1932-account-delete-z6cn5\" (UID: \"8bf3baad-f045-4ab6-a1e2-c96a22526cdc\") " pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.762445 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.762679 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstackclient" podUID="4f8b0c10-1830-4a35-b5d7-a5f00a990965" containerName="openstackclient" containerID="cri-o://b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82" gracePeriod=30 Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.813932 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-operator-scripts\") pod \"glance1932-account-delete-z6cn5\" (UID: \"8bf3baad-f045-4ab6-a1e2-c96a22526cdc\") " pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.814114 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gqbfn\" (UniqueName: \"kubernetes.io/projected/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-kube-api-access-gqbfn\") pod \"glance1932-account-delete-z6cn5\" (UID: \"8bf3baad-f045-4ab6-a1e2-c96a22526cdc\") " pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.815034 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-operator-scripts\") pod \"glance1932-account-delete-z6cn5\" (UID: \"8bf3baad-f045-4ab6-a1e2-c96a22526cdc\") " pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.834780 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f560ac3-026a-44d2-8b41-c0ded2d01b49" path="/var/lib/kubelet/pods/6f560ac3-026a-44d2-8b41-c0ded2d01b49/volumes" Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.837495 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqbfn\" (UniqueName: \"kubernetes.io/projected/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-kube-api-access-gqbfn\") pod \"glance1932-account-delete-z6cn5\" (UID: \"8bf3baad-f045-4ab6-a1e2-c96a22526cdc\") " pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" Mar 07 04:39:11 crc kubenswrapper[4689]: I0307 04:39:11.963054 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.153902 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.225696 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-scripts\") pod \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.225742 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config-secret\") pod \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.225777 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg7nc\" (UniqueName: \"kubernetes.io/projected/4f8b0c10-1830-4a35-b5d7-a5f00a990965-kube-api-access-sg7nc\") pod \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.225823 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config\") pod \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\" (UID: \"4f8b0c10-1830-4a35-b5d7-a5f00a990965\") " Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.226542 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-scripts" (OuterVolumeSpecName: "openstack-scripts") pod "4f8b0c10-1830-4a35-b5d7-a5f00a990965" (UID: "4f8b0c10-1830-4a35-b5d7-a5f00a990965"). InnerVolumeSpecName "openstack-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.234581 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8b0c10-1830-4a35-b5d7-a5f00a990965-kube-api-access-sg7nc" (OuterVolumeSpecName: "kube-api-access-sg7nc") pod "4f8b0c10-1830-4a35-b5d7-a5f00a990965" (UID: "4f8b0c10-1830-4a35-b5d7-a5f00a990965"). InnerVolumeSpecName "kube-api-access-sg7nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.245810 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4f8b0c10-1830-4a35-b5d7-a5f00a990965" (UID: "4f8b0c10-1830-4a35-b5d7-a5f00a990965"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.250816 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4f8b0c10-1830-4a35-b5d7-a5f00a990965" (UID: "4f8b0c10-1830-4a35-b5d7-a5f00a990965"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.327235 4689 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.327274 4689 reconciler_common.go:293] "Volume detached for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.327286 4689 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4f8b0c10-1830-4a35-b5d7-a5f00a990965-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.327299 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg7nc\" (UniqueName: \"kubernetes.io/projected/4f8b0c10-1830-4a35-b5d7-a5f00a990965-kube-api-access-sg7nc\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.451577 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance1932-account-delete-z6cn5"] Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.614241 4689 generic.go:334] "Generic (PLEG): container finished" podID="4f8b0c10-1830-4a35-b5d7-a5f00a990965" containerID="b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82" exitCode=143 Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.614281 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"4f8b0c10-1830-4a35-b5d7-a5f00a990965","Type":"ContainerDied","Data":"b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82"} Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.614729 4689 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"4f8b0c10-1830-4a35-b5d7-a5f00a990965","Type":"ContainerDied","Data":"1598d143917e920cce41aba148e697a3691ad0d35823fc203d7e8ea6d38e8215"} Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.614757 4689 scope.go:117] "RemoveContainer" containerID="b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.614319 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.617620 4689 generic.go:334] "Generic (PLEG): container finished" podID="8a9abad5-4514-4878-b74a-5da9b308c5d6" containerID="ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e" exitCode=143 Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.617680 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8a9abad5-4514-4878-b74a-5da9b308c5d6","Type":"ContainerDied","Data":"ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e"} Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.620006 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" event={"ID":"8bf3baad-f045-4ab6-a1e2-c96a22526cdc","Type":"ContainerStarted","Data":"eb30690f7cfbde23b181a73ce131bea2aa55e2dfb9c338b2b159325a96c8afe7"} Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.620054 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" event={"ID":"8bf3baad-f045-4ab6-a1e2-c96a22526cdc","Type":"ContainerStarted","Data":"294e6ec4bdc50b9985cb7de07e73eeed975819db8abdd486d2d17644af03a071"} Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.625160 4689 generic.go:334] "Generic (PLEG): container finished" podID="5ac85295-97e0-4b1d-a8dd-540613931917" 
containerID="97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7" exitCode=143 Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.625207 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5ac85295-97e0-4b1d-a8dd-540613931917","Type":"ContainerDied","Data":"97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7"} Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.640943 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" podStartSLOduration=1.640922716 podStartE2EDuration="1.640922716s" podCreationTimestamp="2026-03-07 04:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:39:12.640751272 +0000 UTC m=+1197.687134761" watchObservedRunningTime="2026-03-07 04:39:12.640922716 +0000 UTC m=+1197.687306215" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.650113 4689 scope.go:117] "RemoveContainer" containerID="b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82" Mar 07 04:39:12 crc kubenswrapper[4689]: E0307 04:39:12.650639 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82\": container with ID starting with b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82 not found: ID does not exist" containerID="b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82" Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.650701 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82"} err="failed to get container status \"b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82\": rpc error: code = NotFound 
desc = could not find container \"b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82\": container with ID starting with b101f65fd494e1059fdc3a3a9330101741d9b44deb3ecf734724cf3b644dec82 not found: ID does not exist"
Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.669062 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"]
Mar 07 04:39:12 crc kubenswrapper[4689]: I0307 04:39:12.677351 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstackclient"]
Mar 07 04:39:13 crc kubenswrapper[4689]: I0307 04:39:13.637314 4689 generic.go:334] "Generic (PLEG): container finished" podID="8bf3baad-f045-4ab6-a1e2-c96a22526cdc" containerID="eb30690f7cfbde23b181a73ce131bea2aa55e2dfb9c338b2b159325a96c8afe7" exitCode=0
Mar 07 04:39:13 crc kubenswrapper[4689]: I0307 04:39:13.637360 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" event={"ID":"8bf3baad-f045-4ab6-a1e2-c96a22526cdc","Type":"ContainerDied","Data":"eb30690f7cfbde23b181a73ce131bea2aa55e2dfb9c338b2b159325a96c8afe7"}
Mar 07 04:39:13 crc kubenswrapper[4689]: I0307 04:39:13.834371 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8b0c10-1830-4a35-b5d7-a5f00a990965" path="/var/lib/kubelet/pods/4f8b0c10-1830-4a35-b5d7-a5f00a990965/volumes"
Mar 07 04:39:14 crc kubenswrapper[4689]: I0307 04:39:14.982922 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance1932-account-delete-z6cn5"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.067647 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-operator-scripts\") pod \"8bf3baad-f045-4ab6-a1e2-c96a22526cdc\" (UID: \"8bf3baad-f045-4ab6-a1e2-c96a22526cdc\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.067817 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqbfn\" (UniqueName: \"kubernetes.io/projected/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-kube-api-access-gqbfn\") pod \"8bf3baad-f045-4ab6-a1e2-c96a22526cdc\" (UID: \"8bf3baad-f045-4ab6-a1e2-c96a22526cdc\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.069384 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8bf3baad-f045-4ab6-a1e2-c96a22526cdc" (UID: "8bf3baad-f045-4ab6-a1e2-c96a22526cdc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.082237 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-kube-api-access-gqbfn" (OuterVolumeSpecName: "kube-api-access-gqbfn") pod "8bf3baad-f045-4ab6-a1e2-c96a22526cdc" (UID: "8bf3baad-f045-4ab6-a1e2-c96a22526cdc"). InnerVolumeSpecName "kube-api-access-gqbfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.180868 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqbfn\" (UniqueName: \"kubernetes.io/projected/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-kube-api-access-gqbfn\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.180904 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bf3baad-f045-4ab6-a1e2-c96a22526cdc-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.253077 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.281602 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-httpd-run\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.281672 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-nvme\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.281702 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-scripts\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.281805 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282281 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-sys\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282350 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp4fk\" (UniqueName: \"kubernetes.io/projected/8a9abad5-4514-4878-b74a-5da9b308c5d6-kube-api-access-zp4fk\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282450 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282470 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-sys" (OuterVolumeSpecName: "sys") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282482 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-dev\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282506 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-lib-modules\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282560 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-var-locks-brick\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282583 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282606 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-logs\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282641 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-run\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282665 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-iscsi\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.282691 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-config-data\") pod \"8a9abad5-4514-4878-b74a-5da9b308c5d6\" (UID: \"8a9abad5-4514-4878-b74a-5da9b308c5d6\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.283056 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.283158 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.283225 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.283243 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-nvme\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.283253 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-sys\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.283326 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-dev" (OuterVolumeSpecName: "dev") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.283727 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-logs" (OuterVolumeSpecName: "logs") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.283772 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.283799 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-run" (OuterVolumeSpecName: "run") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.283825 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.285163 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9abad5-4514-4878-b74a-5da9b308c5d6-kube-api-access-zp4fk" (OuterVolumeSpecName: "kube-api-access-zp4fk") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "kube-api-access-zp4fk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.286580 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-scripts" (OuterVolumeSpecName: "scripts") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.287033 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.287262 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.290045 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.338469 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-config-data" (OuterVolumeSpecName: "config-data") pod "8a9abad5-4514-4878-b74a-5da9b308c5d6" (UID: "8a9abad5-4514-4878-b74a-5da9b308c5d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384596 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-dev\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384650 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-logs\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384687 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-nvme\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384734 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384786 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-httpd-run\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384802 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-dev" (OuterVolumeSpecName: "dev") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384831 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-lib-modules\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384857 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-sys\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384870 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384894 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-config-data\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384922 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-scripts\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.384947 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-run\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385040 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-sys" (OuterVolumeSpecName: "sys") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385052 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8psw4\" (UniqueName: \"kubernetes.io/projected/5ac85295-97e0-4b1d-a8dd-540613931917-kube-api-access-8psw4\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385137 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-var-locks-brick\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385197 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385212 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385270 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-iscsi\") pod \"5ac85295-97e0-4b1d-a8dd-540613931917\" (UID: \"5ac85295-97e0-4b1d-a8dd-540613931917\") "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385422 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-logs" (OuterVolumeSpecName: "logs") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385795 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-run" (OuterVolumeSpecName: "run") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385815 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-dev\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385835 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-lib-modules\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385849 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-var-locks-brick\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385861 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9abad5-4514-4878-b74a-5da9b308c5d6-logs\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385886 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385898 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-run\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385909 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8a9abad5-4514-4878-b74a-5da9b308c5d6-etc-iscsi\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385921 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385946 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-logs\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385957 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-dev\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385971 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-nvme\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385833 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385983 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9abad5-4514-4878-b74a-5da9b308c5d6-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385838 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385997 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp4fk\" (UniqueName: \"kubernetes.io/projected/8a9abad5-4514-4878-b74a-5da9b308c5d6-kube-api-access-zp4fk\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.385855 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.386026 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ac85295-97e0-4b1d-a8dd-540613931917-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.386049 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-sys\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.386076 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.387416 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.387731 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.388626 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-scripts" (OuterVolumeSpecName: "scripts") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.389719 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac85295-97e0-4b1d-a8dd-540613931917-kube-api-access-8psw4" (OuterVolumeSpecName: "kube-api-access-8psw4") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "kube-api-access-8psw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.403112 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.403266 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.431608 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-config-data" (OuterVolumeSpecName: "config-data") pod "5ac85295-97e0-4b1d-a8dd-540613931917" (UID: "5ac85295-97e0-4b1d-a8dd-540613931917"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.487989 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-lib-modules\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.488036 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.488057 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac85295-97e0-4b1d-a8dd-540613931917-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.488072 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-run\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.488086 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.488103 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8psw4\" (UniqueName: \"kubernetes.io/projected/5ac85295-97e0-4b1d-a8dd-540613931917-kube-api-access-8psw4\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.488121 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.488136 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-var-locks-brick\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.488224 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.488245 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5ac85295-97e0-4b1d-a8dd-540613931917-etc-iscsi\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.488269 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" "
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.504580 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.510891 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.589536 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.589578 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.659661 4689 generic.go:334] "Generic (PLEG): container finished" podID="8a9abad5-4514-4878-b74a-5da9b308c5d6" containerID="af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c" exitCode=0
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.659764 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8a9abad5-4514-4878-b74a-5da9b308c5d6","Type":"ContainerDied","Data":"af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c"}
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.659814 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8a9abad5-4514-4878-b74a-5da9b308c5d6","Type":"ContainerDied","Data":"14cf2d4a79cb25a76206a7e2a91b0098fa73b0f3c67ee3cc9c425810728cb004"}
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.659783 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.659868 4689 scope.go:117] "RemoveContainer" containerID="af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.663428 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance1932-account-delete-z6cn5" event={"ID":"8bf3baad-f045-4ab6-a1e2-c96a22526cdc","Type":"ContainerDied","Data":"294e6ec4bdc50b9985cb7de07e73eeed975819db8abdd486d2d17644af03a071"}
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.663479 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="294e6ec4bdc50b9985cb7de07e73eeed975819db8abdd486d2d17644af03a071"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.663451 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance1932-account-delete-z6cn5"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.666063 4689 generic.go:334] "Generic (PLEG): container finished" podID="5ac85295-97e0-4b1d-a8dd-540613931917" containerID="cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3" exitCode=0
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.666125 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5ac85295-97e0-4b1d-a8dd-540613931917","Type":"ContainerDied","Data":"cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3"}
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.666234 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"5ac85295-97e0-4b1d-a8dd-540613931917","Type":"ContainerDied","Data":"a889654f54fae54b4cdd08c2d881718c970baa86c892fdf1519d9b454ea661e6"}
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.666147 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.703396 4689 scope.go:117] "RemoveContainer" containerID="ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.729648 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.753319 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.756537 4689 scope.go:117] "RemoveContainer" containerID="af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c"
Mar 07 04:39:15 crc kubenswrapper[4689]: E0307 04:39:15.757518 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c\": container with ID starting with af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c not found: ID does not exist" containerID="af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.757583 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c"} err="failed to get container status \"af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c\": rpc error: code = NotFound desc = could not find container \"af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c\": container with ID starting with af7bc7f1f6ad2e778b42e02c3afbb2ff5d88803556d90b5d4e1a87d1d33e516c not found: ID does not exist"
Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.757621 4689 scope.go:117] "RemoveContainer"
containerID="ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e" Mar 07 04:39:15 crc kubenswrapper[4689]: E0307 04:39:15.758348 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e\": container with ID starting with ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e not found: ID does not exist" containerID="ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e" Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.758396 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e"} err="failed to get container status \"ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e\": rpc error: code = NotFound desc = could not find container \"ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e\": container with ID starting with ca5bd974efb9a8cf7ab5428fae52d54697488d34b57bf9e6292ca0714e03657e not found: ID does not exist" Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.758420 4689 scope.go:117] "RemoveContainer" containerID="cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3" Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.759543 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.767156 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.786627 4689 scope.go:117] "RemoveContainer" containerID="97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7" Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.809478 4689 scope.go:117] "RemoveContainer" 
containerID="cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3" Mar 07 04:39:15 crc kubenswrapper[4689]: E0307 04:39:15.810034 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3\": container with ID starting with cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3 not found: ID does not exist" containerID="cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3" Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.810079 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3"} err="failed to get container status \"cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3\": rpc error: code = NotFound desc = could not find container \"cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3\": container with ID starting with cb95978febe4117a5a634119901bc3b9e22917ffc78fa8ff9395344c329e69f3 not found: ID does not exist" Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.810108 4689 scope.go:117] "RemoveContainer" containerID="97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7" Mar 07 04:39:15 crc kubenswrapper[4689]: E0307 04:39:15.810579 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7\": container with ID starting with 97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7 not found: ID does not exist" containerID="97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7" Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.810604 4689 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7"} err="failed to get container status \"97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7\": rpc error: code = NotFound desc = could not find container \"97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7\": container with ID starting with 97e8eb1029fdddee2004f20edb5ffc9499de9f146a6232c9787996ebf0a9d3b7 not found: ID does not exist" Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.835121 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac85295-97e0-4b1d-a8dd-540613931917" path="/var/lib/kubelet/pods/5ac85295-97e0-4b1d-a8dd-540613931917/volumes" Mar 07 04:39:15 crc kubenswrapper[4689]: I0307 04:39:15.836085 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9abad5-4514-4878-b74a-5da9b308c5d6" path="/var/lib/kubelet/pods/8a9abad5-4514-4878-b74a-5da9b308c5d6/volumes" Mar 07 04:39:16 crc kubenswrapper[4689]: I0307 04:39:16.644789 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-lrppg"] Mar 07 04:39:16 crc kubenswrapper[4689]: I0307 04:39:16.658631 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-lrppg"] Mar 07 04:39:16 crc kubenswrapper[4689]: I0307 04:39:16.671706 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-1932-account-create-update-vqggs"] Mar 07 04:39:16 crc kubenswrapper[4689]: I0307 04:39:16.681945 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance1932-account-delete-z6cn5"] Mar 07 04:39:16 crc kubenswrapper[4689]: I0307 04:39:16.689148 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-1932-account-create-update-vqggs"] Mar 07 04:39:16 crc kubenswrapper[4689]: I0307 04:39:16.696564 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/glance1932-account-delete-z6cn5"] Mar 07 04:39:17 crc kubenswrapper[4689]: I0307 04:39:17.838278 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="377abb24-c403-4bc4-96c6-904786cddd96" path="/var/lib/kubelet/pods/377abb24-c403-4bc4-96c6-904786cddd96/volumes" Mar 07 04:39:17 crc kubenswrapper[4689]: I0307 04:39:17.839509 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf3baad-f045-4ab6-a1e2-c96a22526cdc" path="/var/lib/kubelet/pods/8bf3baad-f045-4ab6-a1e2-c96a22526cdc/volumes" Mar 07 04:39:17 crc kubenswrapper[4689]: I0307 04:39:17.840698 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf61f191-6961-4d76-bf10-2a6fad17cab5" path="/var/lib/kubelet/pods/bf61f191-6961-4d76-bf10-2a6fad17cab5/volumes" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.244418 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-hphk6"] Mar 07 04:39:18 crc kubenswrapper[4689]: E0307 04:39:18.245951 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf3baad-f045-4ab6-a1e2-c96a22526cdc" containerName="mariadb-account-delete" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.245973 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf3baad-f045-4ab6-a1e2-c96a22526cdc" containerName="mariadb-account-delete" Mar 07 04:39:18 crc kubenswrapper[4689]: E0307 04:39:18.245994 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8b0c10-1830-4a35-b5d7-a5f00a990965" containerName="openstackclient" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.246001 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8b0c10-1830-4a35-b5d7-a5f00a990965" containerName="openstackclient" Mar 07 04:39:18 crc kubenswrapper[4689]: E0307 04:39:18.246013 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac85295-97e0-4b1d-a8dd-540613931917" containerName="glance-httpd" Mar 07 04:39:18 
crc kubenswrapper[4689]: I0307 04:39:18.246018 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac85295-97e0-4b1d-a8dd-540613931917" containerName="glance-httpd" Mar 07 04:39:18 crc kubenswrapper[4689]: E0307 04:39:18.246033 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9abad5-4514-4878-b74a-5da9b308c5d6" containerName="glance-log" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.246038 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9abad5-4514-4878-b74a-5da9b308c5d6" containerName="glance-log" Mar 07 04:39:18 crc kubenswrapper[4689]: E0307 04:39:18.246047 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9abad5-4514-4878-b74a-5da9b308c5d6" containerName="glance-httpd" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.246053 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9abad5-4514-4878-b74a-5da9b308c5d6" containerName="glance-httpd" Mar 07 04:39:18 crc kubenswrapper[4689]: E0307 04:39:18.246064 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac85295-97e0-4b1d-a8dd-540613931917" containerName="glance-log" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.246070 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac85295-97e0-4b1d-a8dd-540613931917" containerName="glance-log" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.246184 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8b0c10-1830-4a35-b5d7-a5f00a990965" containerName="openstackclient" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.246197 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9abad5-4514-4878-b74a-5da9b308c5d6" containerName="glance-httpd" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.246206 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9abad5-4514-4878-b74a-5da9b308c5d6" containerName="glance-log" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 
04:39:18.246213 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf3baad-f045-4ab6-a1e2-c96a22526cdc" containerName="mariadb-account-delete" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.246221 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac85295-97e0-4b1d-a8dd-540613931917" containerName="glance-log" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.246231 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac85295-97e0-4b1d-a8dd-540613931917" containerName="glance-httpd" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.246649 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-hphk6" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.260336 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-hphk6"] Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.266536 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf"] Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.267384 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.269523 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.271690 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf"] Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.370016 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc5jw\" (UniqueName: \"kubernetes.io/projected/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-kube-api-access-rc5jw\") pod \"glance-db-create-hphk6\" (UID: \"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641\") " pod="glance-kuttl-tests/glance-db-create-hphk6" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.370213 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac245492-4872-4f20-9d9a-4475b571d2e7-operator-scripts\") pod \"glance-6ca4-account-create-update-z8vbf\" (UID: \"ac245492-4872-4f20-9d9a-4475b571d2e7\") " pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.370322 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l2mq\" (UniqueName: \"kubernetes.io/projected/ac245492-4872-4f20-9d9a-4475b571d2e7-kube-api-access-6l2mq\") pod \"glance-6ca4-account-create-update-z8vbf\" (UID: \"ac245492-4872-4f20-9d9a-4475b571d2e7\") " pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.370370 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-operator-scripts\") pod \"glance-db-create-hphk6\" (UID: \"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641\") " pod="glance-kuttl-tests/glance-db-create-hphk6" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.471262 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc5jw\" (UniqueName: \"kubernetes.io/projected/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-kube-api-access-rc5jw\") pod \"glance-db-create-hphk6\" (UID: \"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641\") " pod="glance-kuttl-tests/glance-db-create-hphk6" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.471369 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac245492-4872-4f20-9d9a-4475b571d2e7-operator-scripts\") pod \"glance-6ca4-account-create-update-z8vbf\" (UID: \"ac245492-4872-4f20-9d9a-4475b571d2e7\") " pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.471443 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l2mq\" (UniqueName: \"kubernetes.io/projected/ac245492-4872-4f20-9d9a-4475b571d2e7-kube-api-access-6l2mq\") pod \"glance-6ca4-account-create-update-z8vbf\" (UID: \"ac245492-4872-4f20-9d9a-4475b571d2e7\") " pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.471466 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-operator-scripts\") pod \"glance-db-create-hphk6\" (UID: \"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641\") " pod="glance-kuttl-tests/glance-db-create-hphk6" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.472434 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-operator-scripts\") pod \"glance-db-create-hphk6\" (UID: \"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641\") " pod="glance-kuttl-tests/glance-db-create-hphk6" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.472846 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac245492-4872-4f20-9d9a-4475b571d2e7-operator-scripts\") pod \"glance-6ca4-account-create-update-z8vbf\" (UID: \"ac245492-4872-4f20-9d9a-4475b571d2e7\") " pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.504511 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l2mq\" (UniqueName: \"kubernetes.io/projected/ac245492-4872-4f20-9d9a-4475b571d2e7-kube-api-access-6l2mq\") pod \"glance-6ca4-account-create-update-z8vbf\" (UID: \"ac245492-4872-4f20-9d9a-4475b571d2e7\") " pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.505624 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc5jw\" (UniqueName: \"kubernetes.io/projected/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-kube-api-access-rc5jw\") pod \"glance-db-create-hphk6\" (UID: \"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641\") " pod="glance-kuttl-tests/glance-db-create-hphk6" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.567041 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-hphk6" Mar 07 04:39:18 crc kubenswrapper[4689]: I0307 04:39:18.590644 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" Mar 07 04:39:19 crc kubenswrapper[4689]: I0307 04:39:19.047112 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-hphk6"] Mar 07 04:39:19 crc kubenswrapper[4689]: I0307 04:39:19.108474 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf"] Mar 07 04:39:19 crc kubenswrapper[4689]: W0307 04:39:19.124381 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac245492_4872_4f20_9d9a_4475b571d2e7.slice/crio-a586082cf81353a918fcc8227d584e64fda768a25ce39076ea1794f4349a7030 WatchSource:0}: Error finding container a586082cf81353a918fcc8227d584e64fda768a25ce39076ea1794f4349a7030: Status 404 returned error can't find the container with id a586082cf81353a918fcc8227d584e64fda768a25ce39076ea1794f4349a7030 Mar 07 04:39:19 crc kubenswrapper[4689]: I0307 04:39:19.705758 4689 generic.go:334] "Generic (PLEG): container finished" podID="ac245492-4872-4f20-9d9a-4475b571d2e7" containerID="ca950ee2382d076ca1acd44912cd90dac80f7886c7531c374f37d57c8d68baa9" exitCode=0 Mar 07 04:39:19 crc kubenswrapper[4689]: I0307 04:39:19.705904 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" event={"ID":"ac245492-4872-4f20-9d9a-4475b571d2e7","Type":"ContainerDied","Data":"ca950ee2382d076ca1acd44912cd90dac80f7886c7531c374f37d57c8d68baa9"} Mar 07 04:39:19 crc kubenswrapper[4689]: I0307 04:39:19.705966 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" event={"ID":"ac245492-4872-4f20-9d9a-4475b571d2e7","Type":"ContainerStarted","Data":"a586082cf81353a918fcc8227d584e64fda768a25ce39076ea1794f4349a7030"} Mar 07 04:39:19 crc kubenswrapper[4689]: I0307 04:39:19.707957 4689 generic.go:334] "Generic 
(PLEG): container finished" podID="6f3ac4f8-4f2d-4f1b-bb34-5a884223c641" containerID="20e88a6e6b94587cc0815d0862d76cc505bad41e0e860cf5e1724f7f08245876" exitCode=0 Mar 07 04:39:19 crc kubenswrapper[4689]: I0307 04:39:19.708003 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-hphk6" event={"ID":"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641","Type":"ContainerDied","Data":"20e88a6e6b94587cc0815d0862d76cc505bad41e0e860cf5e1724f7f08245876"} Mar 07 04:39:19 crc kubenswrapper[4689]: I0307 04:39:19.708027 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-hphk6" event={"ID":"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641","Type":"ContainerStarted","Data":"23b4dc2f5f18de98c8f596bc55faf2d2ecf576555b95317018b22e3223639046"} Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.104968 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-hphk6" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.113818 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.212960 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-operator-scripts\") pod \"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641\" (UID: \"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641\") " Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.213094 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc5jw\" (UniqueName: \"kubernetes.io/projected/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-kube-api-access-rc5jw\") pod \"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641\" (UID: \"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641\") " Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.213187 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac245492-4872-4f20-9d9a-4475b571d2e7-operator-scripts\") pod \"ac245492-4872-4f20-9d9a-4475b571d2e7\" (UID: \"ac245492-4872-4f20-9d9a-4475b571d2e7\") " Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.213247 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l2mq\" (UniqueName: \"kubernetes.io/projected/ac245492-4872-4f20-9d9a-4475b571d2e7-kube-api-access-6l2mq\") pod \"ac245492-4872-4f20-9d9a-4475b571d2e7\" (UID: \"ac245492-4872-4f20-9d9a-4475b571d2e7\") " Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.214289 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac245492-4872-4f20-9d9a-4475b571d2e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac245492-4872-4f20-9d9a-4475b571d2e7" (UID: "ac245492-4872-4f20-9d9a-4475b571d2e7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.214595 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f3ac4f8-4f2d-4f1b-bb34-5a884223c641" (UID: "6f3ac4f8-4f2d-4f1b-bb34-5a884223c641"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.219087 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-kube-api-access-rc5jw" (OuterVolumeSpecName: "kube-api-access-rc5jw") pod "6f3ac4f8-4f2d-4f1b-bb34-5a884223c641" (UID: "6f3ac4f8-4f2d-4f1b-bb34-5a884223c641"). InnerVolumeSpecName "kube-api-access-rc5jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.220127 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac245492-4872-4f20-9d9a-4475b571d2e7-kube-api-access-6l2mq" (OuterVolumeSpecName: "kube-api-access-6l2mq") pod "ac245492-4872-4f20-9d9a-4475b571d2e7" (UID: "ac245492-4872-4f20-9d9a-4475b571d2e7"). InnerVolumeSpecName "kube-api-access-6l2mq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.315128 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc5jw\" (UniqueName: \"kubernetes.io/projected/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-kube-api-access-rc5jw\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.315851 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac245492-4872-4f20-9d9a-4475b571d2e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.316058 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l2mq\" (UniqueName: \"kubernetes.io/projected/ac245492-4872-4f20-9d9a-4475b571d2e7-kube-api-access-6l2mq\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.316266 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.725741 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" event={"ID":"ac245492-4872-4f20-9d9a-4475b571d2e7","Type":"ContainerDied","Data":"a586082cf81353a918fcc8227d584e64fda768a25ce39076ea1794f4349a7030"} Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.725974 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a586082cf81353a918fcc8227d584e64fda768a25ce39076ea1794f4349a7030" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.725810 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.727603 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-hphk6" event={"ID":"6f3ac4f8-4f2d-4f1b-bb34-5a884223c641","Type":"ContainerDied","Data":"23b4dc2f5f18de98c8f596bc55faf2d2ecf576555b95317018b22e3223639046"} Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.727705 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23b4dc2f5f18de98c8f596bc55faf2d2ecf576555b95317018b22e3223639046" Mar 07 04:39:21 crc kubenswrapper[4689]: I0307 04:39:21.727648 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-hphk6" Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.464898 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-d6h8z"] Mar 07 04:39:23 crc kubenswrapper[4689]: E0307 04:39:23.465225 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3ac4f8-4f2d-4f1b-bb34-5a884223c641" containerName="mariadb-database-create" Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.465236 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3ac4f8-4f2d-4f1b-bb34-5a884223c641" containerName="mariadb-database-create" Mar 07 04:39:23 crc kubenswrapper[4689]: E0307 04:39:23.465247 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac245492-4872-4f20-9d9a-4475b571d2e7" containerName="mariadb-account-create-update" Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.465253 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac245492-4872-4f20-9d9a-4475b571d2e7" containerName="mariadb-account-create-update" Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.465364 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3ac4f8-4f2d-4f1b-bb34-5a884223c641" 
containerName="mariadb-database-create"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.465375 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac245492-4872-4f20-9d9a-4475b571d2e7" containerName="mariadb-account-create-update"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.465864 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.468193 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.468402 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.468756 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-4qxt9"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.479705 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-d6h8z"]
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.551113 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-config-data\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.551336 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-combined-ca-bundle\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.551409 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d49z6\" (UniqueName: \"kubernetes.io/projected/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-kube-api-access-d49z6\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.551489 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-db-sync-config-data\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.652918 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d49z6\" (UniqueName: \"kubernetes.io/projected/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-kube-api-access-d49z6\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.652983 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-db-sync-config-data\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.653064 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-config-data\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.653130 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-combined-ca-bundle\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.659857 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-combined-ca-bundle\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.660413 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-db-sync-config-data\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.674505 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-config-data\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.683064 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d49z6\" (UniqueName: \"kubernetes.io/projected/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-kube-api-access-d49z6\") pod \"glance-db-sync-d6h8z\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") " pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:23 crc kubenswrapper[4689]: I0307 04:39:23.785004 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:24 crc kubenswrapper[4689]: I0307 04:39:24.303675 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-d6h8z"]
Mar 07 04:39:24 crc kubenswrapper[4689]: W0307 04:39:24.311072 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6f4b785_9b37_4aa9_8edd_9f2ff0d07c78.slice/crio-08f9a289e9d8e4399e4dba94846c5cd700627feb9a6548c4b6fcbd4edc60faf5 WatchSource:0}: Error finding container 08f9a289e9d8e4399e4dba94846c5cd700627feb9a6548c4b6fcbd4edc60faf5: Status 404 returned error can't find the container with id 08f9a289e9d8e4399e4dba94846c5cd700627feb9a6548c4b6fcbd4edc60faf5
Mar 07 04:39:24 crc kubenswrapper[4689]: I0307 04:39:24.751843 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-d6h8z" event={"ID":"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78","Type":"ContainerStarted","Data":"08f9a289e9d8e4399e4dba94846c5cd700627feb9a6548c4b6fcbd4edc60faf5"}
Mar 07 04:39:25 crc kubenswrapper[4689]: I0307 04:39:25.763378 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-d6h8z" event={"ID":"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78","Type":"ContainerStarted","Data":"2af979f74a740f172876998eb5fab2143962635ed30c4581a1c075c83143a1e6"}
Mar 07 04:39:25 crc kubenswrapper[4689]: I0307 04:39:25.788972 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-d6h8z" podStartSLOduration=2.7889499129999997 podStartE2EDuration="2.788949913s" podCreationTimestamp="2026-03-07 04:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:39:25.785131561 +0000 UTC m=+1210.831515070" watchObservedRunningTime="2026-03-07 04:39:25.788949913 +0000 UTC m=+1210.835333412"
Mar 07 04:39:27 crc kubenswrapper[4689]: I0307 04:39:27.782833 4689 generic.go:334] "Generic (PLEG): container finished" podID="c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78" containerID="2af979f74a740f172876998eb5fab2143962635ed30c4581a1c075c83143a1e6" exitCode=0
Mar 07 04:39:27 crc kubenswrapper[4689]: I0307 04:39:27.782925 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-d6h8z" event={"ID":"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78","Type":"ContainerDied","Data":"2af979f74a740f172876998eb5fab2143962635ed30c4581a1c075c83143a1e6"}
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.189453 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.189510 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.189569 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dss5c"
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.190143 4689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae730408636ba5641da1384c81b782848c445de37ccd29b97d13a35866436afe"} pod="openshift-machine-config-operator/machine-config-daemon-dss5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.190204 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" containerID="cri-o://ae730408636ba5641da1384c81b782848c445de37ccd29b97d13a35866436afe" gracePeriod=600
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.227059 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.350977 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-combined-ca-bundle\") pod \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") "
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.351080 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d49z6\" (UniqueName: \"kubernetes.io/projected/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-kube-api-access-d49z6\") pod \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") "
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.351302 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-db-sync-config-data\") pod \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") "
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.351465 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-config-data\") pod \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\" (UID: \"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78\") "
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.356366 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-kube-api-access-d49z6" (OuterVolumeSpecName: "kube-api-access-d49z6") pod "c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78" (UID: "c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78"). InnerVolumeSpecName "kube-api-access-d49z6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.356428 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78" (UID: "c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.377923 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78" (UID: "c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.397009 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-config-data" (OuterVolumeSpecName: "config-data") pod "c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78" (UID: "c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.453679 4689 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.453732 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d49z6\" (UniqueName: \"kubernetes.io/projected/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-kube-api-access-d49z6\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.453754 4689 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.453774 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.806836 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-d6h8z"
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.806826 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-d6h8z" event={"ID":"c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78","Type":"ContainerDied","Data":"08f9a289e9d8e4399e4dba94846c5cd700627feb9a6548c4b6fcbd4edc60faf5"}
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.807091 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08f9a289e9d8e4399e4dba94846c5cd700627feb9a6548c4b6fcbd4edc60faf5"
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.813085 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerID="ae730408636ba5641da1384c81b782848c445de37ccd29b97d13a35866436afe" exitCode=0
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.813339 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerDied","Data":"ae730408636ba5641da1384c81b782848c445de37ccd29b97d13a35866436afe"}
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.813521 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"1d7f7f5d4bedb9f0999f9f7b5b22121b12b61459642fd73d8cbc908ec8691b15"}
Mar 07 04:39:29 crc kubenswrapper[4689]: I0307 04:39:29.813585 4689 scope.go:117] "RemoveContainer" containerID="095186d39ccb32197b5727728ef69f96ce62106ff83eff2af68654fa691615da"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.202575 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Mar 07 04:39:30 crc kubenswrapper[4689]: E0307 04:39:30.204993 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78" containerName="glance-db-sync"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.205071 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78" containerName="glance-db-sync"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.205286 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78" containerName="glance-db-sync"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.206312 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.209777 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.210036 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.210033 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.210316 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.210482 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-4qxt9"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.210492 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.225590 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.267988 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.268032 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-config-data\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.268065 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-scripts\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.268086 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-logs\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.268232 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdlnn\" (UniqueName: \"kubernetes.io/projected/6861838e-5047-41b1-97cb-55d3f03a5122-kube-api-access-rdlnn\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.268276 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-httpd-run\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.268450 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.268495 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.268609 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.370280 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.370343 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.370387 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.370433 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.370459 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-config-data\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.370488 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-scripts\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.370514 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-logs\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.370560 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdlnn\" (UniqueName: \"kubernetes.io/projected/6861838e-5047-41b1-97cb-55d3f03a5122-kube-api-access-rdlnn\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.370582 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-httpd-run\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.371008 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.371242 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-logs\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.371338 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-httpd-run\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.379110 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.380329 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.380738 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.388884 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-config-data\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.397754 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-scripts\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.405136 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdlnn\" (UniqueName: \"kubernetes.io/projected/6861838e-5047-41b1-97cb-55d3f03a5122-kube-api-access-rdlnn\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.410562 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") " pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.522481 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.771984 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Mar 07 04:39:30 crc kubenswrapper[4689]: I0307 04:39:30.823414 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"6861838e-5047-41b1-97cb-55d3f03a5122","Type":"ContainerStarted","Data":"864e7b6973e2b408ff9c277c3ad205f6fc0eedf1d00c1a4c3383463df1f0d2e3"}
Mar 07 04:39:31 crc kubenswrapper[4689]: I0307 04:39:31.288894 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Mar 07 04:39:31 crc kubenswrapper[4689]: I0307 04:39:31.835404 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"6861838e-5047-41b1-97cb-55d3f03a5122","Type":"ContainerStarted","Data":"7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8"}
Mar 07 04:39:31 crc kubenswrapper[4689]: I0307 04:39:31.836066 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"6861838e-5047-41b1-97cb-55d3f03a5122","Type":"ContainerStarted","Data":"9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b"}
Mar 07 04:39:31 crc kubenswrapper[4689]: I0307 04:39:31.835736 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="6861838e-5047-41b1-97cb-55d3f03a5122" containerName="glance-httpd" containerID="cri-o://7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8" gracePeriod=30
Mar 07 04:39:31 crc kubenswrapper[4689]: I0307 04:39:31.835532 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="6861838e-5047-41b1-97cb-55d3f03a5122" containerName="glance-log" containerID="cri-o://9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b" gracePeriod=30
Mar 07 04:39:31 crc kubenswrapper[4689]: I0307 04:39:31.867373 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=1.867355056 podStartE2EDuration="1.867355056s" podCreationTimestamp="2026-03-07 04:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:39:31.865550178 +0000 UTC m=+1216.911933677" watchObservedRunningTime="2026-03-07 04:39:31.867355056 +0000 UTC m=+1216.913738565"
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.289766 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.400339 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-internal-tls-certs\") pod \"6861838e-5047-41b1-97cb-55d3f03a5122\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") "
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.400463 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-logs\") pod \"6861838e-5047-41b1-97cb-55d3f03a5122\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") "
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.400491 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"6861838e-5047-41b1-97cb-55d3f03a5122\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") "
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.400510 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-config-data\") pod \"6861838e-5047-41b1-97cb-55d3f03a5122\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") "
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.400547 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdlnn\" (UniqueName: \"kubernetes.io/projected/6861838e-5047-41b1-97cb-55d3f03a5122-kube-api-access-rdlnn\") pod \"6861838e-5047-41b1-97cb-55d3f03a5122\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") "
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.400594 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-scripts\") pod \"6861838e-5047-41b1-97cb-55d3f03a5122\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") "
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.400683 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-public-tls-certs\") pod \"6861838e-5047-41b1-97cb-55d3f03a5122\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") "
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.400704 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-httpd-run\") pod \"6861838e-5047-41b1-97cb-55d3f03a5122\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") "
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.400727 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-combined-ca-bundle\") pod \"6861838e-5047-41b1-97cb-55d3f03a5122\" (UID: \"6861838e-5047-41b1-97cb-55d3f03a5122\") "
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.400995 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-logs" (OuterVolumeSpecName: "logs") pod "6861838e-5047-41b1-97cb-55d3f03a5122" (UID: "6861838e-5047-41b1-97cb-55d3f03a5122"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.401575 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6861838e-5047-41b1-97cb-55d3f03a5122" (UID: "6861838e-5047-41b1-97cb-55d3f03a5122"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.406270 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "6861838e-5047-41b1-97cb-55d3f03a5122" (UID: "6861838e-5047-41b1-97cb-55d3f03a5122"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.407151 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6861838e-5047-41b1-97cb-55d3f03a5122-kube-api-access-rdlnn" (OuterVolumeSpecName: "kube-api-access-rdlnn") pod "6861838e-5047-41b1-97cb-55d3f03a5122" (UID: "6861838e-5047-41b1-97cb-55d3f03a5122"). InnerVolumeSpecName "kube-api-access-rdlnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.408339 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-scripts" (OuterVolumeSpecName: "scripts") pod "6861838e-5047-41b1-97cb-55d3f03a5122" (UID: "6861838e-5047-41b1-97cb-55d3f03a5122"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.437579 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6861838e-5047-41b1-97cb-55d3f03a5122" (UID: "6861838e-5047-41b1-97cb-55d3f03a5122"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.445771 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-config-data" (OuterVolumeSpecName: "config-data") pod "6861838e-5047-41b1-97cb-55d3f03a5122" (UID: "6861838e-5047-41b1-97cb-55d3f03a5122"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.455236 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6861838e-5047-41b1-97cb-55d3f03a5122" (UID: "6861838e-5047-41b1-97cb-55d3f03a5122"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.456523 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6861838e-5047-41b1-97cb-55d3f03a5122" (UID: "6861838e-5047-41b1-97cb-55d3f03a5122"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.501887 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.501922 4689 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.501935 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.501949 4689 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.501960 4689 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.501970 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6861838e-5047-41b1-97cb-55d3f03a5122-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.502006 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.502018 4689 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6861838e-5047-41b1-97cb-55d3f03a5122-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.502029 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdlnn\" (UniqueName: \"kubernetes.io/projected/6861838e-5047-41b1-97cb-55d3f03a5122-kube-api-access-rdlnn\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.518859 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.604351 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.851368 4689 generic.go:334] "Generic (PLEG): container finished" podID="6861838e-5047-41b1-97cb-55d3f03a5122" containerID="7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8" exitCode=143 Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.851406 4689 generic.go:334] "Generic (PLEG): container finished" podID="6861838e-5047-41b1-97cb-55d3f03a5122" containerID="9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b" exitCode=143 Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.851428 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"6861838e-5047-41b1-97cb-55d3f03a5122","Type":"ContainerDied","Data":"7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8"} Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.851459 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"6861838e-5047-41b1-97cb-55d3f03a5122","Type":"ContainerDied","Data":"9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b"} Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.851473 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"6861838e-5047-41b1-97cb-55d3f03a5122","Type":"ContainerDied","Data":"864e7b6973e2b408ff9c277c3ad205f6fc0eedf1d00c1a4c3383463df1f0d2e3"} Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.851490 4689 scope.go:117] "RemoveContainer" containerID="7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.851632 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.898591 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.903658 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.912504 4689 scope.go:117] "RemoveContainer" containerID="9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.931003 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:39:32 crc kubenswrapper[4689]: E0307 04:39:32.931709 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6861838e-5047-41b1-97cb-55d3f03a5122" containerName="glance-log" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.931846 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="6861838e-5047-41b1-97cb-55d3f03a5122" containerName="glance-log" Mar 07 04:39:32 crc kubenswrapper[4689]: E0307 04:39:32.933696 4689 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6861838e-5047-41b1-97cb-55d3f03a5122" containerName="glance-httpd" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.933852 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="6861838e-5047-41b1-97cb-55d3f03a5122" containerName="glance-httpd" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.940159 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="6861838e-5047-41b1-97cb-55d3f03a5122" containerName="glance-httpd" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.940351 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="6861838e-5047-41b1-97cb-55d3f03a5122" containerName="glance-log" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.941383 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.944871 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.944948 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.945343 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-4qxt9" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.945423 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.946236 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.945498 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 
04:39:32.945643 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.954882 4689 scope.go:117] "RemoveContainer" containerID="7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8" Mar 07 04:39:32 crc kubenswrapper[4689]: E0307 04:39:32.955885 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8\": container with ID starting with 7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8 not found: ID does not exist" containerID="7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.955960 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8"} err="failed to get container status \"7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8\": rpc error: code = NotFound desc = could not find container \"7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8\": container with ID starting with 7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8 not found: ID does not exist" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.956011 4689 scope.go:117] "RemoveContainer" containerID="9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b" Mar 07 04:39:32 crc kubenswrapper[4689]: E0307 04:39:32.957994 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b\": container with ID starting with 9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b not found: ID does not exist" 
containerID="9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.958054 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b"} err="failed to get container status \"9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b\": rpc error: code = NotFound desc = could not find container \"9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b\": container with ID starting with 9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b not found: ID does not exist" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.958077 4689 scope.go:117] "RemoveContainer" containerID="7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.958603 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8"} err="failed to get container status \"7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8\": rpc error: code = NotFound desc = could not find container \"7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8\": container with ID starting with 7afbd8b6ddaefbf371f172935a52f3698a9086f285eebe6863c607bff64cd1c8 not found: ID does not exist" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.958651 4689 scope.go:117] "RemoveContainer" containerID="9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b" Mar 07 04:39:32 crc kubenswrapper[4689]: I0307 04:39:32.959310 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b"} err="failed to get container status \"9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b\": rpc error: code = NotFound desc = could 
not find container \"9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b\": container with ID starting with 9fc159b6d1512dbb539b91eb9e8ee6afbe741346e58ae240e067670b0529327b not found: ID does not exist" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.010153 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.010256 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.010287 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.010320 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.010340 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-logs\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.010354 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-httpd-run\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.010370 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sslcw\" (UniqueName: \"kubernetes.io/projected/75ab803e-7582-4077-9e65-38f9070815b7-kube-api-access-sslcw\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.010392 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-config-data\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.010416 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.111132 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sslcw\" 
(UniqueName: \"kubernetes.io/projected/75ab803e-7582-4077-9e65-38f9070815b7-kube-api-access-sslcw\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.111203 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-config-data\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.111235 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.111280 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.111332 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.111349 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.111380 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.111399 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-logs\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.111417 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-httpd-run\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.111564 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.122882 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-public-tls-certs\") pod 
\"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.126844 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.129631 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-httpd-run\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.129789 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.130339 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-logs\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.142777 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 
04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.145278 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-config-data\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.152855 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sslcw\" (UniqueName: \"kubernetes.io/projected/75ab803e-7582-4077-9e65-38f9070815b7-kube-api-access-sslcw\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.176513 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.271067 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.725089 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:39:33 crc kubenswrapper[4689]: W0307 04:39:33.726841 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75ab803e_7582_4077_9e65_38f9070815b7.slice/crio-2126a70606223c7e6e23b6cc01811ace6a89eafab1492f3c0f3e835d4c9b898c WatchSource:0}: Error finding container 2126a70606223c7e6e23b6cc01811ace6a89eafab1492f3c0f3e835d4c9b898c: Status 404 returned error can't find the container with id 2126a70606223c7e6e23b6cc01811ace6a89eafab1492f3c0f3e835d4c9b898c Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.854356 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6861838e-5047-41b1-97cb-55d3f03a5122" path="/var/lib/kubelet/pods/6861838e-5047-41b1-97cb-55d3f03a5122/volumes" Mar 07 04:39:33 crc kubenswrapper[4689]: I0307 04:39:33.865701 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"75ab803e-7582-4077-9e65-38f9070815b7","Type":"ContainerStarted","Data":"2126a70606223c7e6e23b6cc01811ace6a89eafab1492f3c0f3e835d4c9b898c"} Mar 07 04:39:34 crc kubenswrapper[4689]: I0307 04:39:34.881826 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"75ab803e-7582-4077-9e65-38f9070815b7","Type":"ContainerStarted","Data":"668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648"} Mar 07 04:39:34 crc kubenswrapper[4689]: I0307 04:39:34.882475 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"75ab803e-7582-4077-9e65-38f9070815b7","Type":"ContainerStarted","Data":"8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa"} Mar 07 
Mar 07 04:39:34 crc kubenswrapper[4689]: I0307 04:39:34.917304 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.917283716 podStartE2EDuration="2.917283716s" podCreationTimestamp="2026-03-07 04:39:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:39:34.915239911 +0000 UTC m=+1219.961623450" watchObservedRunningTime="2026-03-07 04:39:34.917283716 +0000 UTC m=+1219.963667205"
Mar 07 04:39:43 crc kubenswrapper[4689]: I0307 04:39:43.271625 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:43 crc kubenswrapper[4689]: I0307 04:39:43.272321 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:43 crc kubenswrapper[4689]: I0307 04:39:43.314994 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:43 crc kubenswrapper[4689]: I0307 04:39:43.330217 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:43 crc kubenswrapper[4689]: I0307 04:39:43.982527 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:43 crc kubenswrapper[4689]: I0307 04:39:43.982967 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:45 crc kubenswrapper[4689]: I0307 04:39:45.858650 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:45 crc kubenswrapper[4689]: I0307 04:39:45.887181 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:47 crc kubenswrapper[4689]: E0307 04:39:47.134714 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found
Mar 07 04:39:47 crc kubenswrapper[4689]: E0307 04:39:47.134791 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts podName:75ab803e-7582-4077-9e65-38f9070815b7 nodeName:}" failed. No retries permitted until 2026-03-07 04:39:47.634771001 +0000 UTC m=+1232.681154490 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts") pod "glance-default-single-0" (UID: "75ab803e-7582-4077-9e65-38f9070815b7") : secret "glance-scripts" not found
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.151044 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-d6h8z"]
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.156949 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-d6h8z"]
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.221084 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance6ca4-account-delete-6fznf"]
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.221884 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf"
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.239350 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance6ca4-account-delete-6fznf"]
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.259276 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.341710 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2gr7\" (UniqueName: \"kubernetes.io/projected/8931d922-aca6-4541-a264-a0945fce34cc-kube-api-access-g2gr7\") pod \"glance6ca4-account-delete-6fznf\" (UID: \"8931d922-aca6-4541-a264-a0945fce34cc\") " pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf"
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.341780 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8931d922-aca6-4541-a264-a0945fce34cc-operator-scripts\") pod \"glance6ca4-account-delete-6fznf\" (UID: \"8931d922-aca6-4541-a264-a0945fce34cc\") " pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf"
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.442780 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2gr7\" (UniqueName: \"kubernetes.io/projected/8931d922-aca6-4541-a264-a0945fce34cc-kube-api-access-g2gr7\") pod \"glance6ca4-account-delete-6fznf\" (UID: \"8931d922-aca6-4541-a264-a0945fce34cc\") " pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf"
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.442844 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8931d922-aca6-4541-a264-a0945fce34cc-operator-scripts\") pod \"glance6ca4-account-delete-6fznf\" (UID: \"8931d922-aca6-4541-a264-a0945fce34cc\") " pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf"
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.443624 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8931d922-aca6-4541-a264-a0945fce34cc-operator-scripts\") pod \"glance6ca4-account-delete-6fznf\" (UID: \"8931d922-aca6-4541-a264-a0945fce34cc\") " pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf"
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.463254 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2gr7\" (UniqueName: \"kubernetes.io/projected/8931d922-aca6-4541-a264-a0945fce34cc-kube-api-access-g2gr7\") pod \"glance6ca4-account-delete-6fznf\" (UID: \"8931d922-aca6-4541-a264-a0945fce34cc\") " pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf"
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.547555 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf"
Mar 07 04:39:47 crc kubenswrapper[4689]: E0307 04:39:47.645497 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found
Mar 07 04:39:47 crc kubenswrapper[4689]: E0307 04:39:47.645864 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts podName:75ab803e-7582-4077-9e65-38f9070815b7 nodeName:}" failed. No retries permitted until 2026-03-07 04:39:48.645845921 +0000 UTC m=+1233.692229400 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts") pod "glance-default-single-0" (UID: "75ab803e-7582-4077-9e65-38f9070815b7") : secret "glance-scripts" not found
Mar 07 04:39:47 crc kubenswrapper[4689]: I0307 04:39:47.835460 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78" path="/var/lib/kubelet/pods/c6f4b785-9b37-4aa9-8edd-9f2ff0d07c78/volumes"
Mar 07 04:39:48 crc kubenswrapper[4689]: I0307 04:39:48.013262 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="75ab803e-7582-4077-9e65-38f9070815b7" containerName="glance-log" containerID="cri-o://8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa" gracePeriod=30
Mar 07 04:39:48 crc kubenswrapper[4689]: I0307 04:39:48.013727 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="75ab803e-7582-4077-9e65-38f9070815b7" containerName="glance-httpd" containerID="cri-o://668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648" gracePeriod=30
Mar 07 04:39:48 crc kubenswrapper[4689]: I0307 04:39:48.023211 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance6ca4-account-delete-6fznf"]
Mar 07 04:39:48 crc kubenswrapper[4689]: E0307 04:39:48.664997 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found
Mar 07 04:39:48 crc kubenswrapper[4689]: E0307 04:39:48.665383 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts podName:75ab803e-7582-4077-9e65-38f9070815b7 nodeName:}" failed. No retries permitted until 2026-03-07 04:39:50.665362661 +0000 UTC m=+1235.711746150 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts") pod "glance-default-single-0" (UID: "75ab803e-7582-4077-9e65-38f9070815b7") : secret "glance-scripts" not found
Mar 07 04:39:49 crc kubenswrapper[4689]: I0307 04:39:49.021161 4689 generic.go:334] "Generic (PLEG): container finished" podID="8931d922-aca6-4541-a264-a0945fce34cc" containerID="4240dafe00147626cda55455f2cd59a98d2e15661dba5b0ed2b113b507baf83e" exitCode=0
Mar 07 04:39:49 crc kubenswrapper[4689]: I0307 04:39:49.021297 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf" event={"ID":"8931d922-aca6-4541-a264-a0945fce34cc","Type":"ContainerDied","Data":"4240dafe00147626cda55455f2cd59a98d2e15661dba5b0ed2b113b507baf83e"}
Mar 07 04:39:49 crc kubenswrapper[4689]: I0307 04:39:49.021543 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf" event={"ID":"8931d922-aca6-4541-a264-a0945fce34cc","Type":"ContainerStarted","Data":"575147619ddd1609f527ef221c5c9920d5ccf571ac1c8659ee80718c12b7146f"}
Mar 07 04:39:49 crc kubenswrapper[4689]: I0307 04:39:49.023952 4689 generic.go:334] "Generic (PLEG): container finished" podID="75ab803e-7582-4077-9e65-38f9070815b7" containerID="8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa" exitCode=143
Mar 07 04:39:49 crc kubenswrapper[4689]: I0307 04:39:49.023984 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"75ab803e-7582-4077-9e65-38f9070815b7","Type":"ContainerDied","Data":"8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa"}
Mar 07 04:39:50 crc kubenswrapper[4689]: I0307 04:39:50.376327 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf"
Mar 07 04:39:50 crc kubenswrapper[4689]: I0307 04:39:50.494949 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8931d922-aca6-4541-a264-a0945fce34cc-operator-scripts\") pod \"8931d922-aca6-4541-a264-a0945fce34cc\" (UID: \"8931d922-aca6-4541-a264-a0945fce34cc\") "
Mar 07 04:39:50 crc kubenswrapper[4689]: I0307 04:39:50.495388 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2gr7\" (UniqueName: \"kubernetes.io/projected/8931d922-aca6-4541-a264-a0945fce34cc-kube-api-access-g2gr7\") pod \"8931d922-aca6-4541-a264-a0945fce34cc\" (UID: \"8931d922-aca6-4541-a264-a0945fce34cc\") "
Mar 07 04:39:50 crc kubenswrapper[4689]: I0307 04:39:50.495695 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8931d922-aca6-4541-a264-a0945fce34cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8931d922-aca6-4541-a264-a0945fce34cc" (UID: "8931d922-aca6-4541-a264-a0945fce34cc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 04:39:50 crc kubenswrapper[4689]: I0307 04:39:50.510948 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8931d922-aca6-4541-a264-a0945fce34cc-kube-api-access-g2gr7" (OuterVolumeSpecName: "kube-api-access-g2gr7") pod "8931d922-aca6-4541-a264-a0945fce34cc" (UID: "8931d922-aca6-4541-a264-a0945fce34cc"). InnerVolumeSpecName "kube-api-access-g2gr7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:39:50 crc kubenswrapper[4689]: I0307 04:39:50.597453 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2gr7\" (UniqueName: \"kubernetes.io/projected/8931d922-aca6-4541-a264-a0945fce34cc-kube-api-access-g2gr7\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:50 crc kubenswrapper[4689]: I0307 04:39:50.597486 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8931d922-aca6-4541-a264-a0945fce34cc-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:50 crc kubenswrapper[4689]: E0307 04:39:50.698730 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found
Mar 07 04:39:50 crc kubenswrapper[4689]: E0307 04:39:50.698801 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts podName:75ab803e-7582-4077-9e65-38f9070815b7 nodeName:}" failed. No retries permitted until 2026-03-07 04:39:54.698787166 +0000 UTC m=+1239.745170655 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts") pod "glance-default-single-0" (UID: "75ab803e-7582-4077-9e65-38f9070815b7") : secret "glance-scripts" not found
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.043965 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf" event={"ID":"8931d922-aca6-4541-a264-a0945fce34cc","Type":"ContainerDied","Data":"575147619ddd1609f527ef221c5c9920d5ccf571ac1c8659ee80718c12b7146f"}
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.044004 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="575147619ddd1609f527ef221c5c9920d5ccf571ac1c8659ee80718c12b7146f"
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.044542 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance6ca4-account-delete-6fznf"
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.606662 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.711669 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-config-data\") pod \"75ab803e-7582-4077-9e65-38f9070815b7\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") "
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.711767 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-internal-tls-certs\") pod \"75ab803e-7582-4077-9e65-38f9070815b7\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") "
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.711868 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts\") pod \"75ab803e-7582-4077-9e65-38f9070815b7\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") "
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.711895 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-httpd-run\") pod \"75ab803e-7582-4077-9e65-38f9070815b7\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") "
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.711945 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-logs\") pod \"75ab803e-7582-4077-9e65-38f9070815b7\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") "
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.711975 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"75ab803e-7582-4077-9e65-38f9070815b7\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") "
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.712013 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sslcw\" (UniqueName: \"kubernetes.io/projected/75ab803e-7582-4077-9e65-38f9070815b7-kube-api-access-sslcw\") pod \"75ab803e-7582-4077-9e65-38f9070815b7\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") "
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.712046 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-public-tls-certs\") pod \"75ab803e-7582-4077-9e65-38f9070815b7\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") "
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.712065 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-combined-ca-bundle\") pod \"75ab803e-7582-4077-9e65-38f9070815b7\" (UID: \"75ab803e-7582-4077-9e65-38f9070815b7\") "
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.712425 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "75ab803e-7582-4077-9e65-38f9070815b7" (UID: "75ab803e-7582-4077-9e65-38f9070815b7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.712666 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-logs" (OuterVolumeSpecName: "logs") pod "75ab803e-7582-4077-9e65-38f9070815b7" (UID: "75ab803e-7582-4077-9e65-38f9070815b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.716873 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "75ab803e-7582-4077-9e65-38f9070815b7" (UID: "75ab803e-7582-4077-9e65-38f9070815b7"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.716906 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ab803e-7582-4077-9e65-38f9070815b7-kube-api-access-sslcw" (OuterVolumeSpecName: "kube-api-access-sslcw") pod "75ab803e-7582-4077-9e65-38f9070815b7" (UID: "75ab803e-7582-4077-9e65-38f9070815b7"). InnerVolumeSpecName "kube-api-access-sslcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.725607 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts" (OuterVolumeSpecName: "scripts") pod "75ab803e-7582-4077-9e65-38f9070815b7" (UID: "75ab803e-7582-4077-9e65-38f9070815b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.743261 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75ab803e-7582-4077-9e65-38f9070815b7" (UID: "75ab803e-7582-4077-9e65-38f9070815b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.763223 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "75ab803e-7582-4077-9e65-38f9070815b7" (UID: "75ab803e-7582-4077-9e65-38f9070815b7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.764698 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-config-data" (OuterVolumeSpecName: "config-data") pod "75ab803e-7582-4077-9e65-38f9070815b7" (UID: "75ab803e-7582-4077-9e65-38f9070815b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.771371 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "75ab803e-7582-4077-9e65-38f9070815b7" (UID: "75ab803e-7582-4077-9e65-38f9070815b7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.813604 4689 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.813646 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.813656 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.813665 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ab803e-7582-4077-9e65-38f9070815b7-logs\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.813699 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" "
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.813711 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sslcw\" (UniqueName: \"kubernetes.io/projected/75ab803e-7582-4077-9e65-38f9070815b7-kube-api-access-sslcw\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.813724 4689 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.813736 4689 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.813749 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ab803e-7582-4077-9e65-38f9070815b7-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.826792 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc"
Mar 07 04:39:51 crc kubenswrapper[4689]: I0307 04:39:51.914969 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\""
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.054696 4689 generic.go:334] "Generic (PLEG): container finished" podID="75ab803e-7582-4077-9e65-38f9070815b7" containerID="668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648" exitCode=0
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.054753 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"75ab803e-7582-4077-9e65-38f9070815b7","Type":"ContainerDied","Data":"668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648"}
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.055007 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"75ab803e-7582-4077-9e65-38f9070815b7","Type":"ContainerDied","Data":"2126a70606223c7e6e23b6cc01811ace6a89eafab1492f3c0f3e835d4c9b898c"}
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.054827 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.055036 4689 scope.go:117] "RemoveContainer" containerID="668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648"
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.108734 4689 scope.go:117] "RemoveContainer" containerID="8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa"
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.110627 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.122030 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.138334 4689 scope.go:117] "RemoveContainer" containerID="668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648"
Mar 07 04:39:52 crc kubenswrapper[4689]: E0307 04:39:52.138863 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648\": container with ID starting with 668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648 not found: ID does not exist" containerID="668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648"
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.138909 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648"} err="failed to get container status \"668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648\": rpc error: code = NotFound desc = could not find container \"668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648\": container with ID starting with 668d25b91157be81be34591aef9e43df67e810fc056d8d2daf0387754afbc648 not found: ID does not exist"
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.138940 4689 scope.go:117] "RemoveContainer" containerID="8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa"
Mar 07 04:39:52 crc kubenswrapper[4689]: E0307 04:39:52.139436 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa\": container with ID starting with 8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa not found: ID does not exist" containerID="8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa"
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.139476 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa"} err="failed to get container status \"8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa\": rpc error: code = NotFound desc = could not find container \"8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa\": container with ID starting with 8f874f9d7f1aa9b34963955cb20af76f9308582ded8dc4e4362ec3aa7c1355aa not found: ID does not exist"
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.242388 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-hphk6"]
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.253003 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-hphk6"]
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.275516 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf"]
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.282552 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance6ca4-account-delete-6fznf"]
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.290316 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-6ca4-account-create-update-z8vbf"]
Mar 07 04:39:52 crc kubenswrapper[4689]: I0307 04:39:52.301162 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance6ca4-account-delete-6fznf"]
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.066067 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-l6584"]
Mar 07 04:39:53 crc kubenswrapper[4689]: E0307 04:39:53.066425 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ab803e-7582-4077-9e65-38f9070815b7" containerName="glance-httpd"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.066439 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ab803e-7582-4077-9e65-38f9070815b7" containerName="glance-httpd"
Mar 07 04:39:53 crc kubenswrapper[4689]: E0307 04:39:53.066461 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8931d922-aca6-4541-a264-a0945fce34cc" containerName="mariadb-account-delete"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.066469 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="8931d922-aca6-4541-a264-a0945fce34cc" containerName="mariadb-account-delete"
Mar 07 04:39:53 crc kubenswrapper[4689]: E0307 04:39:53.066495 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ab803e-7582-4077-9e65-38f9070815b7" containerName="glance-log"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.066503 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ab803e-7582-4077-9e65-38f9070815b7" containerName="glance-log"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.066667 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ab803e-7582-4077-9e65-38f9070815b7" containerName="glance-httpd"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.066682 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ab803e-7582-4077-9e65-38f9070815b7" containerName="glance-log"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.066692 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="8931d922-aca6-4541-a264-a0945fce34cc" containerName="mariadb-account-delete"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.067438 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-l6584"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.083143 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-l6584"]
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.171773 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-9726-account-create-update-brmpz"]
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.172754 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.175006 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.181010 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-9726-account-create-update-brmpz"]
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.239601 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zl54\" (UniqueName: \"kubernetes.io/projected/57bb4f6f-37c7-4ebb-8431-4371063f99a4-kube-api-access-9zl54\") pod \"glance-db-create-l6584\" (UID: \"57bb4f6f-37c7-4ebb-8431-4371063f99a4\") " pod="glance-kuttl-tests/glance-db-create-l6584"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.239655 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57bb4f6f-37c7-4ebb-8431-4371063f99a4-operator-scripts\") pod \"glance-db-create-l6584\" (UID: \"57bb4f6f-37c7-4ebb-8431-4371063f99a4\") " pod="glance-kuttl-tests/glance-db-create-l6584"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.340689 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57bb4f6f-37c7-4ebb-8431-4371063f99a4-operator-scripts\") pod \"glance-db-create-l6584\" (UID: \"57bb4f6f-37c7-4ebb-8431-4371063f99a4\") " pod="glance-kuttl-tests/glance-db-create-l6584"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.340882 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304ba78d-8988-4227-a87a-188ec9bb00d7-operator-scripts\") pod \"glance-9726-account-create-update-brmpz\" (UID: \"304ba78d-8988-4227-a87a-188ec9bb00d7\") " pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.340979 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvssz\" (UniqueName: \"kubernetes.io/projected/304ba78d-8988-4227-a87a-188ec9bb00d7-kube-api-access-dvssz\") pod \"glance-9726-account-create-update-brmpz\" (UID: \"304ba78d-8988-4227-a87a-188ec9bb00d7\") " pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.341046 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zl54\" (UniqueName: \"kubernetes.io/projected/57bb4f6f-37c7-4ebb-8431-4371063f99a4-kube-api-access-9zl54\") pod \"glance-db-create-l6584\" (UID: \"57bb4f6f-37c7-4ebb-8431-4371063f99a4\") " pod="glance-kuttl-tests/glance-db-create-l6584"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.342064 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57bb4f6f-37c7-4ebb-8431-4371063f99a4-operator-scripts\") pod \"glance-db-create-l6584\" (UID: \"57bb4f6f-37c7-4ebb-8431-4371063f99a4\") " pod="glance-kuttl-tests/glance-db-create-l6584"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.371813 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zl54\" (UniqueName: \"kubernetes.io/projected/57bb4f6f-37c7-4ebb-8431-4371063f99a4-kube-api-access-9zl54\") pod \"glance-db-create-l6584\" (UID: \"57bb4f6f-37c7-4ebb-8431-4371063f99a4\") " pod="glance-kuttl-tests/glance-db-create-l6584"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.383606 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-l6584"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.442069 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvssz\" (UniqueName: \"kubernetes.io/projected/304ba78d-8988-4227-a87a-188ec9bb00d7-kube-api-access-dvssz\") pod \"glance-9726-account-create-update-brmpz\" (UID: \"304ba78d-8988-4227-a87a-188ec9bb00d7\") " pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.442269 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304ba78d-8988-4227-a87a-188ec9bb00d7-operator-scripts\") pod \"glance-9726-account-create-update-brmpz\" (UID: \"304ba78d-8988-4227-a87a-188ec9bb00d7\") " pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.443052 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304ba78d-8988-4227-a87a-188ec9bb00d7-operator-scripts\") pod \"glance-9726-account-create-update-brmpz\" (UID: \"304ba78d-8988-4227-a87a-188ec9bb00d7\") " pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.470554 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvssz\" (UniqueName: \"kubernetes.io/projected/304ba78d-8988-4227-a87a-188ec9bb00d7-kube-api-access-dvssz\") pod \"glance-9726-account-create-update-brmpz\" (UID: \"304ba78d-8988-4227-a87a-188ec9bb00d7\") " pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.503687 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.834600 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3ac4f8-4f2d-4f1b-bb34-5a884223c641" path="/var/lib/kubelet/pods/6f3ac4f8-4f2d-4f1b-bb34-5a884223c641/volumes"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.835442 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75ab803e-7582-4077-9e65-38f9070815b7" path="/var/lib/kubelet/pods/75ab803e-7582-4077-9e65-38f9070815b7/volumes"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.836126 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8931d922-aca6-4541-a264-a0945fce34cc" path="/var/lib/kubelet/pods/8931d922-aca6-4541-a264-a0945fce34cc/volumes"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.837054 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac245492-4872-4f20-9d9a-4475b571d2e7" path="/var/lib/kubelet/pods/ac245492-4872-4f20-9d9a-4475b571d2e7/volumes"
Mar 07 04:39:53 crc kubenswrapper[4689]: I0307 04:39:53.910226 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-l6584"]
Mar 07 04:39:53 crc kubenswrapper[4689]:
I0307 04:39:53.998193 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-9726-account-create-update-brmpz"] Mar 07 04:39:53 crc kubenswrapper[4689]: W0307 04:39:53.998765 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod304ba78d_8988_4227_a87a_188ec9bb00d7.slice/crio-8502c075468b473b4db0374d561e5c5faaf7759ce92676687aaa1ed42cdf79a1 WatchSource:0}: Error finding container 8502c075468b473b4db0374d561e5c5faaf7759ce92676687aaa1ed42cdf79a1: Status 404 returned error can't find the container with id 8502c075468b473b4db0374d561e5c5faaf7759ce92676687aaa1ed42cdf79a1 Mar 07 04:39:54 crc kubenswrapper[4689]: I0307 04:39:54.083820 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz" event={"ID":"304ba78d-8988-4227-a87a-188ec9bb00d7","Type":"ContainerStarted","Data":"8502c075468b473b4db0374d561e5c5faaf7759ce92676687aaa1ed42cdf79a1"} Mar 07 04:39:54 crc kubenswrapper[4689]: I0307 04:39:54.085610 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-l6584" event={"ID":"57bb4f6f-37c7-4ebb-8431-4371063f99a4","Type":"ContainerStarted","Data":"8c2fc50de57b0e9a6da42b42437cb93ae36d7629ec62c582a0cb40619c73a64c"} Mar 07 04:39:54 crc kubenswrapper[4689]: I0307 04:39:54.085669 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-l6584" event={"ID":"57bb4f6f-37c7-4ebb-8431-4371063f99a4","Type":"ContainerStarted","Data":"2e8b7696812de242240b4eb2b72f08fe5fb17b5d48a362b16dc8a75233b4410f"} Mar 07 04:39:54 crc kubenswrapper[4689]: I0307 04:39:54.112900 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-l6584" podStartSLOduration=1.112881406 podStartE2EDuration="1.112881406s" podCreationTimestamp="2026-03-07 04:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:39:54.107662776 +0000 UTC m=+1239.154046275" watchObservedRunningTime="2026-03-07 04:39:54.112881406 +0000 UTC m=+1239.159264905" Mar 07 04:39:55 crc kubenswrapper[4689]: I0307 04:39:55.096741 4689 generic.go:334] "Generic (PLEG): container finished" podID="57bb4f6f-37c7-4ebb-8431-4371063f99a4" containerID="8c2fc50de57b0e9a6da42b42437cb93ae36d7629ec62c582a0cb40619c73a64c" exitCode=0 Mar 07 04:39:55 crc kubenswrapper[4689]: I0307 04:39:55.096791 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-l6584" event={"ID":"57bb4f6f-37c7-4ebb-8431-4371063f99a4","Type":"ContainerDied","Data":"8c2fc50de57b0e9a6da42b42437cb93ae36d7629ec62c582a0cb40619c73a64c"} Mar 07 04:39:55 crc kubenswrapper[4689]: I0307 04:39:55.098794 4689 generic.go:334] "Generic (PLEG): container finished" podID="304ba78d-8988-4227-a87a-188ec9bb00d7" containerID="db77210edc7bd42b64114823658b941bdfdefc1225d3d7a4f8d6c3b89ed1fe89" exitCode=0 Mar 07 04:39:55 crc kubenswrapper[4689]: I0307 04:39:55.098849 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz" event={"ID":"304ba78d-8988-4227-a87a-188ec9bb00d7","Type":"ContainerDied","Data":"db77210edc7bd42b64114823658b941bdfdefc1225d3d7a4f8d6c3b89ed1fe89"} Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.599975 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz" Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.615327 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-l6584" Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.694585 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvssz\" (UniqueName: \"kubernetes.io/projected/304ba78d-8988-4227-a87a-188ec9bb00d7-kube-api-access-dvssz\") pod \"304ba78d-8988-4227-a87a-188ec9bb00d7\" (UID: \"304ba78d-8988-4227-a87a-188ec9bb00d7\") " Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.694659 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304ba78d-8988-4227-a87a-188ec9bb00d7-operator-scripts\") pod \"304ba78d-8988-4227-a87a-188ec9bb00d7\" (UID: \"304ba78d-8988-4227-a87a-188ec9bb00d7\") " Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.694727 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zl54\" (UniqueName: \"kubernetes.io/projected/57bb4f6f-37c7-4ebb-8431-4371063f99a4-kube-api-access-9zl54\") pod \"57bb4f6f-37c7-4ebb-8431-4371063f99a4\" (UID: \"57bb4f6f-37c7-4ebb-8431-4371063f99a4\") " Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.694757 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57bb4f6f-37c7-4ebb-8431-4371063f99a4-operator-scripts\") pod \"57bb4f6f-37c7-4ebb-8431-4371063f99a4\" (UID: \"57bb4f6f-37c7-4ebb-8431-4371063f99a4\") " Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.695803 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304ba78d-8988-4227-a87a-188ec9bb00d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "304ba78d-8988-4227-a87a-188ec9bb00d7" (UID: "304ba78d-8988-4227-a87a-188ec9bb00d7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.695838 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57bb4f6f-37c7-4ebb-8431-4371063f99a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57bb4f6f-37c7-4ebb-8431-4371063f99a4" (UID: "57bb4f6f-37c7-4ebb-8431-4371063f99a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.701121 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57bb4f6f-37c7-4ebb-8431-4371063f99a4-kube-api-access-9zl54" (OuterVolumeSpecName: "kube-api-access-9zl54") pod "57bb4f6f-37c7-4ebb-8431-4371063f99a4" (UID: "57bb4f6f-37c7-4ebb-8431-4371063f99a4"). InnerVolumeSpecName "kube-api-access-9zl54". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.701608 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/304ba78d-8988-4227-a87a-188ec9bb00d7-kube-api-access-dvssz" (OuterVolumeSpecName: "kube-api-access-dvssz") pod "304ba78d-8988-4227-a87a-188ec9bb00d7" (UID: "304ba78d-8988-4227-a87a-188ec9bb00d7"). InnerVolumeSpecName "kube-api-access-dvssz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.796442 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zl54\" (UniqueName: \"kubernetes.io/projected/57bb4f6f-37c7-4ebb-8431-4371063f99a4-kube-api-access-9zl54\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.796478 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57bb4f6f-37c7-4ebb-8431-4371063f99a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.796487 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvssz\" (UniqueName: \"kubernetes.io/projected/304ba78d-8988-4227-a87a-188ec9bb00d7-kube-api-access-dvssz\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:56 crc kubenswrapper[4689]: I0307 04:39:56.796497 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304ba78d-8988-4227-a87a-188ec9bb00d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:39:57 crc kubenswrapper[4689]: I0307 04:39:57.117054 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-l6584" event={"ID":"57bb4f6f-37c7-4ebb-8431-4371063f99a4","Type":"ContainerDied","Data":"2e8b7696812de242240b4eb2b72f08fe5fb17b5d48a362b16dc8a75233b4410f"} Mar 07 04:39:57 crc kubenswrapper[4689]: I0307 04:39:57.117298 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e8b7696812de242240b4eb2b72f08fe5fb17b5d48a362b16dc8a75233b4410f" Mar 07 04:39:57 crc kubenswrapper[4689]: I0307 04:39:57.117352 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-l6584" Mar 07 04:39:57 crc kubenswrapper[4689]: I0307 04:39:57.119850 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz" event={"ID":"304ba78d-8988-4227-a87a-188ec9bb00d7","Type":"ContainerDied","Data":"8502c075468b473b4db0374d561e5c5faaf7759ce92676687aaa1ed42cdf79a1"} Mar 07 04:39:57 crc kubenswrapper[4689]: I0307 04:39:57.119902 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8502c075468b473b4db0374d561e5c5faaf7759ce92676687aaa1ed42cdf79a1" Mar 07 04:39:57 crc kubenswrapper[4689]: I0307 04:39:57.119999 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-9726-account-create-update-brmpz" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.304758 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-tbm46"] Mar 07 04:39:58 crc kubenswrapper[4689]: E0307 04:39:58.305107 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57bb4f6f-37c7-4ebb-8431-4371063f99a4" containerName="mariadb-database-create" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.305124 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="57bb4f6f-37c7-4ebb-8431-4371063f99a4" containerName="mariadb-database-create" Mar 07 04:39:58 crc kubenswrapper[4689]: E0307 04:39:58.305165 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304ba78d-8988-4227-a87a-188ec9bb00d7" containerName="mariadb-account-create-update" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.305200 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="304ba78d-8988-4227-a87a-188ec9bb00d7" containerName="mariadb-account-create-update" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.305384 4689 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="304ba78d-8988-4227-a87a-188ec9bb00d7" containerName="mariadb-account-create-update" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.305412 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="57bb4f6f-37c7-4ebb-8431-4371063f99a4" containerName="mariadb-database-create" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.306018 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.307822 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.307841 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-rclx4" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.317106 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tbm46"] Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.421532 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-config-data\") pod \"glance-db-sync-tbm46\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.421633 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-db-sync-config-data\") pod \"glance-db-sync-tbm46\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.421716 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c74gd\" 
(UniqueName: \"kubernetes.io/projected/910303bb-e941-4f39-ab22-16b6d79339c6-kube-api-access-c74gd\") pod \"glance-db-sync-tbm46\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.523266 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c74gd\" (UniqueName: \"kubernetes.io/projected/910303bb-e941-4f39-ab22-16b6d79339c6-kube-api-access-c74gd\") pod \"glance-db-sync-tbm46\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.523388 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-config-data\") pod \"glance-db-sync-tbm46\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.523478 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-db-sync-config-data\") pod \"glance-db-sync-tbm46\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.532323 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-db-sync-config-data\") pod \"glance-db-sync-tbm46\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.532573 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-config-data\") pod \"glance-db-sync-tbm46\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.547204 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c74gd\" (UniqueName: \"kubernetes.io/projected/910303bb-e941-4f39-ab22-16b6d79339c6-kube-api-access-c74gd\") pod \"glance-db-sync-tbm46\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.627117 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:39:58 crc kubenswrapper[4689]: W0307 04:39:58.846449 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod910303bb_e941_4f39_ab22_16b6d79339c6.slice/crio-9e52f301e13f91684d7b9d724ebdc99244980d14aa0f320b1b029d4ed4bce66e WatchSource:0}: Error finding container 9e52f301e13f91684d7b9d724ebdc99244980d14aa0f320b1b029d4ed4bce66e: Status 404 returned error can't find the container with id 9e52f301e13f91684d7b9d724ebdc99244980d14aa0f320b1b029d4ed4bce66e Mar 07 04:39:58 crc kubenswrapper[4689]: I0307 04:39:58.847765 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tbm46"] Mar 07 04:39:59 crc kubenswrapper[4689]: I0307 04:39:59.158210 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tbm46" event={"ID":"910303bb-e941-4f39-ab22-16b6d79339c6","Type":"ContainerStarted","Data":"9e52f301e13f91684d7b9d724ebdc99244980d14aa0f320b1b029d4ed4bce66e"} Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.152153 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547640-t6v74"] Mar 07 04:40:00 crc 
kubenswrapper[4689]: I0307 04:40:00.153523 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547640-t6v74" Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.160495 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547640-t6v74"] Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.161849 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.161917 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.161963 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.167516 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tbm46" event={"ID":"910303bb-e941-4f39-ab22-16b6d79339c6","Type":"ContainerStarted","Data":"6581163c011f345b4ceb2e523c7d3ca03cc42ac901270c85b57fd83e692a48e2"} Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.246361 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n89bx\" (UniqueName: \"kubernetes.io/projected/eced259c-8292-4c5f-9698-a6830b08653a-kube-api-access-n89bx\") pod \"auto-csr-approver-29547640-t6v74\" (UID: \"eced259c-8292-4c5f-9698-a6830b08653a\") " pod="openshift-infra/auto-csr-approver-29547640-t6v74" Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.347973 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n89bx\" (UniqueName: \"kubernetes.io/projected/eced259c-8292-4c5f-9698-a6830b08653a-kube-api-access-n89bx\") pod \"auto-csr-approver-29547640-t6v74\" (UID: \"eced259c-8292-4c5f-9698-a6830b08653a\") " 
pod="openshift-infra/auto-csr-approver-29547640-t6v74" Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.365041 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n89bx\" (UniqueName: \"kubernetes.io/projected/eced259c-8292-4c5f-9698-a6830b08653a-kube-api-access-n89bx\") pod \"auto-csr-approver-29547640-t6v74\" (UID: \"eced259c-8292-4c5f-9698-a6830b08653a\") " pod="openshift-infra/auto-csr-approver-29547640-t6v74" Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.472689 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547640-t6v74" Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.758027 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-tbm46" podStartSLOduration=2.758002508 podStartE2EDuration="2.758002508s" podCreationTimestamp="2026-03-07 04:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:40:00.202337326 +0000 UTC m=+1245.248720815" watchObservedRunningTime="2026-03-07 04:40:00.758002508 +0000 UTC m=+1245.804385997" Mar 07 04:40:00 crc kubenswrapper[4689]: I0307 04:40:00.764136 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547640-t6v74"] Mar 07 04:40:01 crc kubenswrapper[4689]: I0307 04:40:01.176942 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547640-t6v74" event={"ID":"eced259c-8292-4c5f-9698-a6830b08653a","Type":"ContainerStarted","Data":"323212983fe3082b39fda226a7154596bb1e5442d21c0eb15dba49d26378811d"} Mar 07 04:40:03 crc kubenswrapper[4689]: I0307 04:40:03.214593 4689 generic.go:334] "Generic (PLEG): container finished" podID="eced259c-8292-4c5f-9698-a6830b08653a" containerID="7404fcd03e14e2a6214723039b11259a7b435865c11742201d4bee45bb36582a" exitCode=0 Mar 07 04:40:03 
crc kubenswrapper[4689]: I0307 04:40:03.214675 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547640-t6v74" event={"ID":"eced259c-8292-4c5f-9698-a6830b08653a","Type":"ContainerDied","Data":"7404fcd03e14e2a6214723039b11259a7b435865c11742201d4bee45bb36582a"} Mar 07 04:40:03 crc kubenswrapper[4689]: I0307 04:40:03.219044 4689 generic.go:334] "Generic (PLEG): container finished" podID="910303bb-e941-4f39-ab22-16b6d79339c6" containerID="6581163c011f345b4ceb2e523c7d3ca03cc42ac901270c85b57fd83e692a48e2" exitCode=0 Mar 07 04:40:03 crc kubenswrapper[4689]: I0307 04:40:03.219141 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tbm46" event={"ID":"910303bb-e941-4f39-ab22-16b6d79339c6","Type":"ContainerDied","Data":"6581163c011f345b4ceb2e523c7d3ca03cc42ac901270c85b57fd83e692a48e2"} Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.578787 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547640-t6v74" Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.611316 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n89bx\" (UniqueName: \"kubernetes.io/projected/eced259c-8292-4c5f-9698-a6830b08653a-kube-api-access-n89bx\") pod \"eced259c-8292-4c5f-9698-a6830b08653a\" (UID: \"eced259c-8292-4c5f-9698-a6830b08653a\") " Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.617387 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eced259c-8292-4c5f-9698-a6830b08653a-kube-api-access-n89bx" (OuterVolumeSpecName: "kube-api-access-n89bx") pod "eced259c-8292-4c5f-9698-a6830b08653a" (UID: "eced259c-8292-4c5f-9698-a6830b08653a"). InnerVolumeSpecName "kube-api-access-n89bx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.657220 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.712904 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c74gd\" (UniqueName: \"kubernetes.io/projected/910303bb-e941-4f39-ab22-16b6d79339c6-kube-api-access-c74gd\") pod \"910303bb-e941-4f39-ab22-16b6d79339c6\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.712976 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-config-data\") pod \"910303bb-e941-4f39-ab22-16b6d79339c6\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.713010 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-db-sync-config-data\") pod \"910303bb-e941-4f39-ab22-16b6d79339c6\" (UID: \"910303bb-e941-4f39-ab22-16b6d79339c6\") " Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.713729 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n89bx\" (UniqueName: \"kubernetes.io/projected/eced259c-8292-4c5f-9698-a6830b08653a-kube-api-access-n89bx\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.715957 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910303bb-e941-4f39-ab22-16b6d79339c6-kube-api-access-c74gd" (OuterVolumeSpecName: "kube-api-access-c74gd") pod "910303bb-e941-4f39-ab22-16b6d79339c6" (UID: "910303bb-e941-4f39-ab22-16b6d79339c6"). 
InnerVolumeSpecName "kube-api-access-c74gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.716629 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "910303bb-e941-4f39-ab22-16b6d79339c6" (UID: "910303bb-e941-4f39-ab22-16b6d79339c6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.749783 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-config-data" (OuterVolumeSpecName: "config-data") pod "910303bb-e941-4f39-ab22-16b6d79339c6" (UID: "910303bb-e941-4f39-ab22-16b6d79339c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.816119 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c74gd\" (UniqueName: \"kubernetes.io/projected/910303bb-e941-4f39-ab22-16b6d79339c6-kube-api-access-c74gd\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.816196 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:04 crc kubenswrapper[4689]: I0307 04:40:04.816213 4689 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/910303bb-e941-4f39-ab22-16b6d79339c6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:05 crc kubenswrapper[4689]: I0307 04:40:05.242348 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547640-t6v74" 
event={"ID":"eced259c-8292-4c5f-9698-a6830b08653a","Type":"ContainerDied","Data":"323212983fe3082b39fda226a7154596bb1e5442d21c0eb15dba49d26378811d"} Mar 07 04:40:05 crc kubenswrapper[4689]: I0307 04:40:05.242381 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547640-t6v74" Mar 07 04:40:05 crc kubenswrapper[4689]: I0307 04:40:05.242400 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="323212983fe3082b39fda226a7154596bb1e5442d21c0eb15dba49d26378811d" Mar 07 04:40:05 crc kubenswrapper[4689]: I0307 04:40:05.245932 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-tbm46" event={"ID":"910303bb-e941-4f39-ab22-16b6d79339c6","Type":"ContainerDied","Data":"9e52f301e13f91684d7b9d724ebdc99244980d14aa0f320b1b029d4ed4bce66e"} Mar 07 04:40:05 crc kubenswrapper[4689]: I0307 04:40:05.245979 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e52f301e13f91684d7b9d724ebdc99244980d14aa0f320b1b029d4ed4bce66e" Mar 07 04:40:05 crc kubenswrapper[4689]: I0307 04:40:05.246041 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-tbm46" Mar 07 04:40:05 crc kubenswrapper[4689]: I0307 04:40:05.653064 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547634-8rcjv"] Mar 07 04:40:05 crc kubenswrapper[4689]: I0307 04:40:05.658264 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547634-8rcjv"] Mar 07 04:40:05 crc kubenswrapper[4689]: I0307 04:40:05.840978 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d21cd9-9297-44bf-8680-246a190f3110" path="/var/lib/kubelet/pods/f6d21cd9-9297-44bf-8680-246a190f3110/volumes" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.372141 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:40:06 crc kubenswrapper[4689]: E0307 04:40:06.372496 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910303bb-e941-4f39-ab22-16b6d79339c6" containerName="glance-db-sync" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.372520 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="910303bb-e941-4f39-ab22-16b6d79339c6" containerName="glance-db-sync" Mar 07 04:40:06 crc kubenswrapper[4689]: E0307 04:40:06.372534 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eced259c-8292-4c5f-9698-a6830b08653a" containerName="oc" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.372542 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="eced259c-8292-4c5f-9698-a6830b08653a" containerName="oc" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.372712 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="eced259c-8292-4c5f-9698-a6830b08653a" containerName="oc" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.372733 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="910303bb-e941-4f39-ab22-16b6d79339c6" containerName="glance-db-sync" 
Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.373904 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.375528 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-rclx4" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.375844 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.385271 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.388266 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.449537 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.449619 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-run\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.449686 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.449719 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.449772 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-logs\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.449797 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.449829 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xtsf\" (UniqueName: \"kubernetes.io/projected/e5e79db1-aa6d-4206-ab3e-3f722931924d-kube-api-access-4xtsf\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.449896 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.449925 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.449948 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.449971 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-dev\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.450021 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-scripts\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.450083 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" 
(UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.450141 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-sys\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551099 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551146 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551175 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551284 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-dev\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551312 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-scripts\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551332 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551349 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551400 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-sys\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551366 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-sys\") pod 
\"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551417 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-dev\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551461 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551505 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551525 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551554 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-run\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") 
" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551612 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-config-data\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551626 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551664 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-run\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551675 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551707 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-logs\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc 
kubenswrapper[4689]: I0307 04:40:06.551732 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.551779 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xtsf\" (UniqueName: \"kubernetes.io/projected/e5e79db1-aa6d-4206-ab3e-3f722931924d-kube-api-access-4xtsf\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.552006 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.552033 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.552118 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 
crc kubenswrapper[4689]: I0307 04:40:06.552141 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-logs\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.555124 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-scripts\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.556789 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-config-data\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.572693 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xtsf\" (UniqueName: \"kubernetes.io/projected/e5e79db1-aa6d-4206-ab3e-3f722931924d-kube-api-access-4xtsf\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.574766 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.581719 4689 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.650290 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.651713 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.653677 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.662077 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.690542 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.754214 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.754259 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.754285 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-run\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.754303 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.754341 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-dev\") pod \"glance-default-internal-api-0\" (UID: 
\"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.754417 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.754487 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.754544 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.754575 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-logs\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.754729 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.754783 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.756113 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-sys\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.756176 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.756234 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt9rv\" (UniqueName: \"kubernetes.io/projected/d678e8d6-5265-47c9-a485-c8048c4edde7-kube-api-access-mt9rv\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857025 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-logs\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857368 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857392 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857413 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-sys\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857451 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857474 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt9rv\" (UniqueName: 
\"kubernetes.io/projected/d678e8d6-5265-47c9-a485-c8048c4edde7-kube-api-access-mt9rv\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857501 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857532 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857560 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-run\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857578 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857623 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-dev\") 
pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857652 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857678 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857740 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857746 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-sys\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857764 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857795 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-logs\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.857988 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.858018 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-dev\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.858230 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.858283 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") device mount path \"/mnt/openstack/pv06\"" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.858333 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.858425 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.858445 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.858478 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-run\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.868056 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc 
kubenswrapper[4689]: I0307 04:40:06.873800 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.883317 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt9rv\" (UniqueName: \"kubernetes.io/projected/d678e8d6-5265-47c9-a485-c8048c4edde7-kube-api-access-mt9rv\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.898432 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.903379 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.930529 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:40:06 crc kubenswrapper[4689]: I0307 04:40:06.966439 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:07 crc kubenswrapper[4689]: I0307 04:40:07.270392 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"e5e79db1-aa6d-4206-ab3e-3f722931924d","Type":"ContainerStarted","Data":"54ecd246f2e60abf9a2d13aca438ca4411ec72e07783071da6d30e62c5b8a471"} Mar 07 04:40:07 crc kubenswrapper[4689]: I0307 04:40:07.271065 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"e5e79db1-aa6d-4206-ab3e-3f722931924d","Type":"ContainerStarted","Data":"39698ded7b0a3ed528848dbb68451914b83d2e6550ad1187448aed1936996a0d"} Mar 07 04:40:07 crc kubenswrapper[4689]: I0307 04:40:07.271128 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"e5e79db1-aa6d-4206-ab3e-3f722931924d","Type":"ContainerStarted","Data":"b77799847633e1df76b9fc201abbd1e10bfc9d4ed78cf17a728b24fce5f5d0bd"} Mar 07 04:40:07 crc kubenswrapper[4689]: I0307 04:40:07.396275 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:07 crc kubenswrapper[4689]: W0307 04:40:07.402138 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd678e8d6_5265_47c9_a485_c8048c4edde7.slice/crio-f52741fd1698871c8d6df64335faea56427c663bd139af35af49eb60d8d36a7f WatchSource:0}: Error finding container f52741fd1698871c8d6df64335faea56427c663bd139af35af49eb60d8d36a7f: Status 404 returned error can't find the container with id f52741fd1698871c8d6df64335faea56427c663bd139af35af49eb60d8d36a7f Mar 07 04:40:07 crc kubenswrapper[4689]: I0307 04:40:07.469089 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 
04:40:08.279965 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"e5e79db1-aa6d-4206-ab3e-3f722931924d","Type":"ContainerStarted","Data":"34f93fc3f5cf43acb412e66e235fad4f36a65fa810cf86bc11f803e334789008"} Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.288184 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d678e8d6-5265-47c9-a485-c8048c4edde7","Type":"ContainerStarted","Data":"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76"} Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.288241 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d678e8d6-5265-47c9-a485-c8048c4edde7","Type":"ContainerStarted","Data":"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3"} Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.288251 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d678e8d6-5265-47c9-a485-c8048c4edde7","Type":"ContainerStarted","Data":"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985"} Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.288260 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d678e8d6-5265-47c9-a485-c8048c4edde7","Type":"ContainerStarted","Data":"f52741fd1698871c8d6df64335faea56427c663bd139af35af49eb60d8d36a7f"} Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.288378 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-log" containerID="cri-o://61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985" gracePeriod=30 Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 
04:40:08.288622 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-api" containerID="cri-o://0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76" gracePeriod=30 Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.288699 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-httpd" containerID="cri-o://f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3" gracePeriod=30 Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.350399 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.350382516 podStartE2EDuration="3.350382516s" podCreationTimestamp="2026-03-07 04:40:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:40:08.349194924 +0000 UTC m=+1253.395578413" watchObservedRunningTime="2026-03-07 04:40:08.350382516 +0000 UTC m=+1253.396766005" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.351903 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.351895907 podStartE2EDuration="2.351895907s" podCreationTimestamp="2026-03-07 04:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:40:08.316618047 +0000 UTC m=+1253.363001536" watchObservedRunningTime="2026-03-07 04:40:08.351895907 +0000 UTC m=+1253.398279396" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.713498 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.887733 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-httpd-run\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888024 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-nvme\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888089 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888121 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888282 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt9rv\" (UniqueName: \"kubernetes.io/projected/d678e8d6-5265-47c9-a485-c8048c4edde7-kube-api-access-mt9rv\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888393 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-run\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888458 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-run" (OuterVolumeSpecName: "run") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888546 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-lib-modules\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888621 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-iscsi\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888583 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888686 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-scripts\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888812 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888752 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.888950 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-logs\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889024 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889097 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-var-locks-brick\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889178 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-dev\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889260 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-logs" (OuterVolumeSpecName: "logs") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889274 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-config-data\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889360 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-sys\") pod \"d678e8d6-5265-47c9-a485-c8048c4edde7\" (UID: \"d678e8d6-5265-47c9-a485-c8048c4edde7\") " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889460 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889501 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-dev" (OuterVolumeSpecName: "dev") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889880 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889898 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889912 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889871 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-sys" (OuterVolumeSpecName: "sys") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889920 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d678e8d6-5265-47c9-a485-c8048c4edde7-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889970 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889986 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.889997 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.890006 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.893086 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.896399 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-scripts" (OuterVolumeSpecName: "scripts") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.899586 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d678e8d6-5265-47c9-a485-c8048c4edde7-kube-api-access-mt9rv" (OuterVolumeSpecName: "kube-api-access-mt9rv") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "kube-api-access-mt9rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.910336 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance-cache") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.988244 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-config-data" (OuterVolumeSpecName: "config-data") pod "d678e8d6-5265-47c9-a485-c8048c4edde7" (UID: "d678e8d6-5265-47c9-a485-c8048c4edde7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.991135 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt9rv\" (UniqueName: \"kubernetes.io/projected/d678e8d6-5265-47c9-a485-c8048c4edde7-kube-api-access-mt9rv\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.991170 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.991211 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.991225 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.991234 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d678e8d6-5265-47c9-a485-c8048c4edde7-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:08 crc kubenswrapper[4689]: I0307 04:40:08.991245 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d678e8d6-5265-47c9-a485-c8048c4edde7-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.005097 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.013012 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.092752 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.092783 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.310221 4689 generic.go:334] "Generic (PLEG): container finished" podID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerID="0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76" exitCode=143 Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.310268 4689 generic.go:334] "Generic (PLEG): container finished" podID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerID="f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3" exitCode=143 Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.310286 4689 generic.go:334] "Generic (PLEG): container finished" podID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerID="61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985" exitCode=143 Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.310308 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d678e8d6-5265-47c9-a485-c8048c4edde7","Type":"ContainerDied","Data":"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76"} Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.310363 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d678e8d6-5265-47c9-a485-c8048c4edde7","Type":"ContainerDied","Data":"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3"} Mar 07 04:40:09 crc 
kubenswrapper[4689]: I0307 04:40:09.310384 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d678e8d6-5265-47c9-a485-c8048c4edde7","Type":"ContainerDied","Data":"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985"} Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.310404 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d678e8d6-5265-47c9-a485-c8048c4edde7","Type":"ContainerDied","Data":"f52741fd1698871c8d6df64335faea56427c663bd139af35af49eb60d8d36a7f"} Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.310428 4689 scope.go:117] "RemoveContainer" containerID="0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.310284 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.331409 4689 scope.go:117] "RemoveContainer" containerID="f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.358747 4689 scope.go:117] "RemoveContainer" containerID="61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.374146 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.395709 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.401903 4689 scope.go:117] "RemoveContainer" containerID="0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76" Mar 07 04:40:09 crc kubenswrapper[4689]: E0307 04:40:09.402320 4689 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76\": container with ID starting with 0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76 not found: ID does not exist" containerID="0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.402362 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76"} err="failed to get container status \"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76\": rpc error: code = NotFound desc = could not find container \"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76\": container with ID starting with 0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76 not found: ID does not exist" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.402385 4689 scope.go:117] "RemoveContainer" containerID="f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3" Mar 07 04:40:09 crc kubenswrapper[4689]: E0307 04:40:09.402586 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3\": container with ID starting with f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3 not found: ID does not exist" containerID="f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.402612 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3"} err="failed to get container status \"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3\": rpc error: code = NotFound desc = could not find container 
\"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3\": container with ID starting with f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3 not found: ID does not exist" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.402628 4689 scope.go:117] "RemoveContainer" containerID="61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985" Mar 07 04:40:09 crc kubenswrapper[4689]: E0307 04:40:09.403392 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985\": container with ID starting with 61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985 not found: ID does not exist" containerID="61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.403441 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985"} err="failed to get container status \"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985\": rpc error: code = NotFound desc = could not find container \"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985\": container with ID starting with 61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985 not found: ID does not exist" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.403469 4689 scope.go:117] "RemoveContainer" containerID="0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.405410 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76"} err="failed to get container status \"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76\": rpc error: code = NotFound desc = could not find 
container \"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76\": container with ID starting with 0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76 not found: ID does not exist" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.405494 4689 scope.go:117] "RemoveContainer" containerID="f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.405885 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3"} err="failed to get container status \"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3\": rpc error: code = NotFound desc = could not find container \"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3\": container with ID starting with f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3 not found: ID does not exist" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.405933 4689 scope.go:117] "RemoveContainer" containerID="61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.406335 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985"} err="failed to get container status \"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985\": rpc error: code = NotFound desc = could not find container \"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985\": container with ID starting with 61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985 not found: ID does not exist" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.406371 4689 scope.go:117] "RemoveContainer" containerID="0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.406784 4689 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:09 crc kubenswrapper[4689]: E0307 04:40:09.407210 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-httpd" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.407294 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-httpd" Mar 07 04:40:09 crc kubenswrapper[4689]: E0307 04:40:09.407390 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-log" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.407480 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-log" Mar 07 04:40:09 crc kubenswrapper[4689]: E0307 04:40:09.407566 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-api" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.407638 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-api" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.407868 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-log" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.407949 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-api" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.408032 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" containerName="glance-httpd" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.408070 4689 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76"} err="failed to get container status \"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76\": rpc error: code = NotFound desc = could not find container \"0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76\": container with ID starting with 0ffc86c7c85fb577263bdb81376fb8fbb9eed87ec14421ca4404722f7b8fef76 not found: ID does not exist" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.408265 4689 scope.go:117] "RemoveContainer" containerID="f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.408581 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3"} err="failed to get container status \"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3\": rpc error: code = NotFound desc = could not find container \"f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3\": container with ID starting with f1fdc0d47ee9a3a988843b6db55b4892cfc48ceda1566b23bb9d2553651540f3 not found: ID does not exist" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.408610 4689 scope.go:117] "RemoveContainer" containerID="61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.408913 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985"} err="failed to get container status \"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985\": rpc error: code = NotFound desc = could not find container \"61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985\": container with ID starting with 61a048badfdb2b5da975fcf760a347e4656e33f661370caebb96261c3d139985 not found: ID does not 
exist" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.409610 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.412733 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.415138 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.599845 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.600013 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.600112 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-dev\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.600232 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-scripts\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.600341 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.600426 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.600556 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.600672 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmg9v\" (UniqueName: \"kubernetes.io/projected/992376fa-f803-4a38-859a-3ddc5b52a191-kube-api-access-pmg9v\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.600769 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.600867 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-logs\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.601029 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-config-data\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.601175 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-sys\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.601400 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-run\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.601580 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702548 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702616 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmg9v\" (UniqueName: \"kubernetes.io/projected/992376fa-f803-4a38-859a-3ddc5b52a191-kube-api-access-pmg9v\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702637 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702658 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-logs\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702680 4689 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-config-data\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702702 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-sys\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702726 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-run\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702723 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702778 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702752 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702911 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702930 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702970 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.702994 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-dev\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.703025 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-scripts\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.703089 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.703110 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.703305 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.704047 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-logs\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.704218 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-dev\") pod 
\"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.704330 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.704468 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.704576 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-sys\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.704679 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-run\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.705447 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.710250 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-config-data\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.718015 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-scripts\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.721477 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.725525 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmg9v\" (UniqueName: \"kubernetes.io/projected/992376fa-f803-4a38-859a-3ddc5b52a191-kube-api-access-pmg9v\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: I0307 04:40:09.733817 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:09 crc kubenswrapper[4689]: 
I0307 04:40:09.836089 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d678e8d6-5265-47c9-a485-c8048c4edde7" path="/var/lib/kubelet/pods/d678e8d6-5265-47c9-a485-c8048c4edde7/volumes" Mar 07 04:40:10 crc kubenswrapper[4689]: I0307 04:40:10.026466 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:10 crc kubenswrapper[4689]: I0307 04:40:10.361268 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:10 crc kubenswrapper[4689]: W0307 04:40:10.367031 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod992376fa_f803_4a38_859a_3ddc5b52a191.slice/crio-947a912681b3b8dd37306772b392b6ec7d6c91e5ad787671e07d4353c4890062 WatchSource:0}: Error finding container 947a912681b3b8dd37306772b392b6ec7d6c91e5ad787671e07d4353c4890062: Status 404 returned error can't find the container with id 947a912681b3b8dd37306772b392b6ec7d6c91e5ad787671e07d4353c4890062 Mar 07 04:40:11 crc kubenswrapper[4689]: I0307 04:40:11.345808 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"992376fa-f803-4a38-859a-3ddc5b52a191","Type":"ContainerStarted","Data":"547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b"} Mar 07 04:40:11 crc kubenswrapper[4689]: I0307 04:40:11.346720 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"992376fa-f803-4a38-859a-3ddc5b52a191","Type":"ContainerStarted","Data":"efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a"} Mar 07 04:40:11 crc kubenswrapper[4689]: I0307 04:40:11.346804 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"992376fa-f803-4a38-859a-3ddc5b52a191","Type":"ContainerStarted","Data":"2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5"} Mar 07 04:40:11 crc kubenswrapper[4689]: I0307 04:40:11.346878 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"992376fa-f803-4a38-859a-3ddc5b52a191","Type":"ContainerStarted","Data":"947a912681b3b8dd37306772b392b6ec7d6c91e5ad787671e07d4353c4890062"} Mar 07 04:40:11 crc kubenswrapper[4689]: I0307 04:40:11.410736 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.410706571 podStartE2EDuration="2.410706571s" podCreationTimestamp="2026-03-07 04:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:40:11.405635504 +0000 UTC m=+1256.452019013" watchObservedRunningTime="2026-03-07 04:40:11.410706571 +0000 UTC m=+1256.457090100" Mar 07 04:40:16 crc kubenswrapper[4689]: I0307 04:40:16.693059 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:16 crc kubenswrapper[4689]: I0307 04:40:16.693685 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:16 crc kubenswrapper[4689]: I0307 04:40:16.693704 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:16 crc kubenswrapper[4689]: I0307 04:40:16.720021 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:16 crc kubenswrapper[4689]: I0307 04:40:16.734679 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:16 crc kubenswrapper[4689]: I0307 04:40:16.743122 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:17 crc kubenswrapper[4689]: I0307 04:40:17.409582 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:17 crc kubenswrapper[4689]: I0307 04:40:17.409643 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:17 crc kubenswrapper[4689]: I0307 04:40:17.409663 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:17 crc kubenswrapper[4689]: I0307 04:40:17.422802 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:17 crc kubenswrapper[4689]: I0307 04:40:17.425461 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:17 crc kubenswrapper[4689]: I0307 04:40:17.428667 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:20 crc kubenswrapper[4689]: I0307 04:40:20.027508 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:20 crc kubenswrapper[4689]: I0307 04:40:20.029493 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:20 crc kubenswrapper[4689]: I0307 04:40:20.029725 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:20 crc 
kubenswrapper[4689]: I0307 04:40:20.068771 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:20 crc kubenswrapper[4689]: I0307 04:40:20.070927 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:20 crc kubenswrapper[4689]: I0307 04:40:20.094049 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:20 crc kubenswrapper[4689]: I0307 04:40:20.435824 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:20 crc kubenswrapper[4689]: I0307 04:40:20.435861 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:20 crc kubenswrapper[4689]: I0307 04:40:20.435872 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:20 crc kubenswrapper[4689]: I0307 04:40:20.450768 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:20 crc kubenswrapper[4689]: I0307 04:40:20.463619 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:20 crc kubenswrapper[4689]: I0307 04:40:20.470958 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.449209 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.486978 4689 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.497608 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.497656 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.506711 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.513859 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.557086 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.558845 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.568420 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.571791 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.589496 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.594917 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.630244 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-config-data\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.630531 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-run\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.630653 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-sys\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.630738 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qpr8\" (UniqueName: \"kubernetes.io/projected/4c0a16f2-251d-4b9c-a03c-336d12a54add-kube-api-access-8qpr8\") pod \"glance-default-external-api-1\" (UID: 
\"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.630848 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-dev\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.631046 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.631235 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.631364 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.631453 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-scripts\") pod 
\"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.631538 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.631882 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.632008 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.632144 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-logs\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.632271 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.632689 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.734333 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.734599 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.734993 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.735146 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.735275 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.735873 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-config-data\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.736000 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.736097 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-dev\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.737026 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-logs\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.737403 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.737595 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.737688 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.737789 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-sys\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.737898 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.738159 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-scripts\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.735696 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.737551 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.735116 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.734948 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.735245 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.734504 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.737369 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-logs\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.739369 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-run\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.739486 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.739585 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.739686 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98msr\" (UniqueName: \"kubernetes.io/projected/27c10bcf-0231-449b-8a8d-4f7dd44f7547-kube-api-access-98msr\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.739807 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-logs\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.739941 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.740050 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.739590 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.740330 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-config-data\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.740366 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-run\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.740403 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qpr8\" (UniqueName: \"kubernetes.io/projected/4c0a16f2-251d-4b9c-a03c-336d12a54add-kube-api-access-8qpr8\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.740421 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-sys\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.740450 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-scripts\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.740485 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.740522 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-dev\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.740681 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-dev\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.740843 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-sys\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.740876 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-run\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.743981 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-config-data\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.744143 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-scripts\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.757530 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.757779 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-1\" (UID: 
\"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.762828 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.767287 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qpr8\" (UniqueName: \"kubernetes.io/projected/4c0a16f2-251d-4b9c-a03c-336d12a54add-kube-api-access-8qpr8\") pod \"glance-default-external-api-1\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.839955 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842033 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842102 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842156 4689 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-sys\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842248 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842303 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842339 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-run\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842380 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-dev\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842418 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842456 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98msr\" (UniqueName: \"kubernetes.io/projected/27c10bcf-0231-449b-8a8d-4f7dd44f7547-kube-api-access-98msr\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842511 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-logs\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842557 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-scripts\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842616 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-config-data\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842657 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842695 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-scripts\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842732 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-scripts\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842768 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842803 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t92x2\" (UniqueName: \"kubernetes.io/projected/0982b9c9-87f9-40a1-b776-1e889e04caa4-kube-api-access-t92x2\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842839 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842880 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-run\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842920 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.842959 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-sys\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843006 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-logs\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843043 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843079 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843108 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-logs\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843149 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843217 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-config-data\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843251 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843287 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-887h9\" (UniqueName: \"kubernetes.io/projected/1b8ccc32-b665-4c4a-bece-bda801b97ba8-kube-api-access-887h9\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843322 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-dev\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843361 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-config-data\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843397 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843439 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843490 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843533 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843577 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843610 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-run\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843640 4689 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-sys\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843676 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-dev\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843711 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843745 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843902 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.843970 4689 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.844026 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-sys\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.844239 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.844869 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.844904 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-dev\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.845248 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.845274 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-run\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.845458 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-logs\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.845656 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.849621 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-scripts\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.853035 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-config-data\") pod 
\"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.862341 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98msr\" (UniqueName: \"kubernetes.io/projected/27c10bcf-0231-449b-8a8d-4f7dd44f7547-kube-api-access-98msr\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.880476 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-2\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.947413 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.947494 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.947562 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-lib-modules\") pod \"glance-default-internal-api-1\" (UID: 
\"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.947599 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.947632 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-run\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.947669 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-sys\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.947699 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-dev\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.947740 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.947797 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948026 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948096 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-dev\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948208 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-scripts\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948337 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-config-data\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 
04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948397 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948455 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-scripts\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948509 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t92x2\" (UniqueName: \"kubernetes.io/projected/0982b9c9-87f9-40a1-b776-1e889e04caa4-kube-api-access-t92x2\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948555 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948630 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-run\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948705 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948759 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-sys\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948870 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948912 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-logs\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948962 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.948987 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-logs\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.949037 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.949065 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-887h9\" (UniqueName: \"kubernetes.io/projected/1b8ccc32-b665-4c4a-bece-bda801b97ba8-kube-api-access-887h9\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.949116 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-config-data\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.949203 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.949863 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.949939 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.950002 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.950798 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.951005 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.956780 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.957405 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.958003 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.958072 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-run\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.958116 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-sys\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.958157 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-dev\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.958469 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-scripts\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.959309 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.960930 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.961031 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.961416 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") 
pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.961650 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.961721 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-run\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.962050 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-logs\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.962291 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-sys\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.962814 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-httpd-run\") pod \"glance-default-internal-api-1\" (UID: 
\"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.962688 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-logs\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.968069 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-scripts\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.968219 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-dev\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.970365 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-config-data\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.972090 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-config-data\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 
crc kubenswrapper[4689]: I0307 04:40:22.982059 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t92x2\" (UniqueName: \"kubernetes.io/projected/0982b9c9-87f9-40a1-b776-1e889e04caa4-kube-api-access-t92x2\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.988410 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.989662 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:22 crc kubenswrapper[4689]: I0307 04:40:22.994521 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-887h9\" (UniqueName: \"kubernetes.io/projected/1b8ccc32-b665-4c4a-bece-bda801b97ba8-kube-api-access-887h9\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:23 crc kubenswrapper[4689]: I0307 04:40:23.002693 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-1\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:23 crc kubenswrapper[4689]: I0307 04:40:23.024561 4689 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-2\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:23 crc kubenswrapper[4689]: I0307 04:40:23.127052 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:23 crc kubenswrapper[4689]: I0307 04:40:23.199225 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:23 crc kubenswrapper[4689]: I0307 04:40:23.211077 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:23 crc kubenswrapper[4689]: I0307 04:40:23.358095 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:40:23 crc kubenswrapper[4689]: W0307 04:40:23.360114 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c0a16f2_251d_4b9c_a03c_336d12a54add.slice/crio-0891b81225da304122ad3187b865fe33c1c251ffa5c184ce27b76caaff4a70d9 WatchSource:0}: Error finding container 0891b81225da304122ad3187b865fe33c1c251ffa5c184ce27b76caaff4a70d9: Status 404 returned error can't find the container with id 0891b81225da304122ad3187b865fe33c1c251ffa5c184ce27b76caaff4a70d9 Mar 07 04:40:23 crc kubenswrapper[4689]: I0307 04:40:23.469061 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"4c0a16f2-251d-4b9c-a03c-336d12a54add","Type":"ContainerStarted","Data":"0891b81225da304122ad3187b865fe33c1c251ffa5c184ce27b76caaff4a70d9"} Mar 07 04:40:23 crc kubenswrapper[4689]: I0307 04:40:23.615823 4689 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:40:23 crc kubenswrapper[4689]: W0307 04:40:23.618248 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27c10bcf_0231_449b_8a8d_4f7dd44f7547.slice/crio-1ad770038b78d1a65cda73d1e2e1855b2ead24beb8c5eab46c743de4c51bfa29 WatchSource:0}: Error finding container 1ad770038b78d1a65cda73d1e2e1855b2ead24beb8c5eab46c743de4c51bfa29: Status 404 returned error can't find the container with id 1ad770038b78d1a65cda73d1e2e1855b2ead24beb8c5eab46c743de4c51bfa29 Mar 07 04:40:23 crc kubenswrapper[4689]: I0307 04:40:23.742119 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:40:23 crc kubenswrapper[4689]: I0307 04:40:23.750732 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:40:23 crc kubenswrapper[4689]: W0307 04:40:23.750839 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b8ccc32_b665_4c4a_bece_bda801b97ba8.slice/crio-e8899ae5b3d6c26e6e49d5d1f982784cc4b8de0cd01c63345a0eef57d2f6ed60 WatchSource:0}: Error finding container e8899ae5b3d6c26e6e49d5d1f982784cc4b8de0cd01c63345a0eef57d2f6ed60: Status 404 returned error can't find the container with id e8899ae5b3d6c26e6e49d5d1f982784cc4b8de0cd01c63345a0eef57d2f6ed60 Mar 07 04:40:23 crc kubenswrapper[4689]: W0307 04:40:23.755614 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0982b9c9_87f9_40a1_b776_1e889e04caa4.slice/crio-d8944146cbcb382d5d3618dc99811769abcbc81ac9b47857e4b41a511dfbb66f WatchSource:0}: Error finding container d8944146cbcb382d5d3618dc99811769abcbc81ac9b47857e4b41a511dfbb66f: Status 404 returned error can't find the container with id 
d8944146cbcb382d5d3618dc99811769abcbc81ac9b47857e4b41a511dfbb66f Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.492010 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"0982b9c9-87f9-40a1-b776-1e889e04caa4","Type":"ContainerStarted","Data":"fffa631dedda68e9081b9715d9fb8e0eea97fa5c2184e1bc227ba0ac0d8ecf26"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.492588 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"0982b9c9-87f9-40a1-b776-1e889e04caa4","Type":"ContainerStarted","Data":"9f526e39ad786fb7107204aaebc9f4fca1c7ff6754c30d170abb12b687553462"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.492607 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"0982b9c9-87f9-40a1-b776-1e889e04caa4","Type":"ContainerStarted","Data":"3598a27cce744ba4368f8c3021a5a2979509933cb2d1c429bade24025ba6b037"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.492619 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"0982b9c9-87f9-40a1-b776-1e889e04caa4","Type":"ContainerStarted","Data":"d8944146cbcb382d5d3618dc99811769abcbc81ac9b47857e4b41a511dfbb66f"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.495667 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1b8ccc32-b665-4c4a-bece-bda801b97ba8","Type":"ContainerStarted","Data":"07f4ba9781ab1a44d03850cb77daeeefcb63ec84b00b669ace1cd2914c5f6801"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.495705 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"1b8ccc32-b665-4c4a-bece-bda801b97ba8","Type":"ContainerStarted","Data":"4e114ae4a5af7e5b561bba4f555c4a98852ffd20be449e5b69bede0c396f39b5"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.495716 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1b8ccc32-b665-4c4a-bece-bda801b97ba8","Type":"ContainerStarted","Data":"303084ff304c4049d9965d75aabee6bf368d9e556243c9f65468ad9a8427ca2d"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.495727 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1b8ccc32-b665-4c4a-bece-bda801b97ba8","Type":"ContainerStarted","Data":"e8899ae5b3d6c26e6e49d5d1f982784cc4b8de0cd01c63345a0eef57d2f6ed60"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.497906 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"4c0a16f2-251d-4b9c-a03c-336d12a54add","Type":"ContainerStarted","Data":"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.497935 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"4c0a16f2-251d-4b9c-a03c-336d12a54add","Type":"ContainerStarted","Data":"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.497948 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"4c0a16f2-251d-4b9c-a03c-336d12a54add","Type":"ContainerStarted","Data":"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.500418 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" 
event={"ID":"27c10bcf-0231-449b-8a8d-4f7dd44f7547","Type":"ContainerStarted","Data":"953fed04805b9fa8dc17e1d3a91063aad5de1fb0c53dcb17ca5918320c49bbac"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.500447 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"27c10bcf-0231-449b-8a8d-4f7dd44f7547","Type":"ContainerStarted","Data":"572ffdb4e47cffc557664180d047863876402926745fb197765ba4cd1b2b7d67"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.500459 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"27c10bcf-0231-449b-8a8d-4f7dd44f7547","Type":"ContainerStarted","Data":"ab525ee23d41d9c43242b85a9161e0dc019a3c75c0d9019ef0e09b7b41faaad6"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.500470 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"27c10bcf-0231-449b-8a8d-4f7dd44f7547","Type":"ContainerStarted","Data":"1ad770038b78d1a65cda73d1e2e1855b2ead24beb8c5eab46c743de4c51bfa29"} Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.514627 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.5145780049999997 podStartE2EDuration="3.514578005s" podCreationTimestamp="2026-03-07 04:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:40:24.512725394 +0000 UTC m=+1269.559108883" watchObservedRunningTime="2026-03-07 04:40:24.514578005 +0000 UTC m=+1269.560961504" Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.537827 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.53780637 podStartE2EDuration="3.53780637s" 
podCreationTimestamp="2026-03-07 04:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:40:24.533105733 +0000 UTC m=+1269.579489232" watchObservedRunningTime="2026-03-07 04:40:24.53780637 +0000 UTC m=+1269.584189869" Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.567136 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.567120669 podStartE2EDuration="3.567120669s" podCreationTimestamp="2026-03-07 04:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:40:24.563368929 +0000 UTC m=+1269.609752418" watchObservedRunningTime="2026-03-07 04:40:24.567120669 +0000 UTC m=+1269.613504158" Mar 07 04:40:24 crc kubenswrapper[4689]: I0307 04:40:24.600937 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.600915009 podStartE2EDuration="3.600915009s" podCreationTimestamp="2026-03-07 04:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:40:24.593340095 +0000 UTC m=+1269.639723594" watchObservedRunningTime="2026-03-07 04:40:24.600915009 +0000 UTC m=+1269.647298508" Mar 07 04:40:32 crc kubenswrapper[4689]: I0307 04:40:32.840736 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:32 crc kubenswrapper[4689]: I0307 04:40:32.842546 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:32 crc kubenswrapper[4689]: I0307 04:40:32.842667 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:32 crc kubenswrapper[4689]: I0307 04:40:32.864256 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:32 crc kubenswrapper[4689]: I0307 04:40:32.870247 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:32 crc kubenswrapper[4689]: I0307 04:40:32.876632 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.127302 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.127355 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.127368 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.158919 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.175787 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.201964 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.202024 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.202511 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.212221 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.212325 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.213006 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.219354 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.247996 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.250620 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.257643 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.261489 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.270470 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 
crc kubenswrapper[4689]: I0307 04:40:33.274394 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.597315 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.598494 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.598557 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.598586 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.598610 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.598634 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.598659 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.598683 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.598707 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.598732 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.598755 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.598776 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.609879 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.610718 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.610823 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.612815 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.613268 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.613882 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.616495 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.617276 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 crc 
kubenswrapper[4689]: I0307 04:40:33.620282 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.624223 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.625224 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:33 crc kubenswrapper[4689]: I0307 04:40:33.626042 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:35 crc kubenswrapper[4689]: I0307 04:40:35.104564 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:40:35 crc kubenswrapper[4689]: I0307 04:40:35.113026 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:40:35 crc kubenswrapper[4689]: I0307 04:40:35.287496 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:40:35 crc kubenswrapper[4689]: I0307 04:40:35.324372 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.614644 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-log" containerID="cri-o://9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439" gracePeriod=30 Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.615037 4689 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-external-api-1" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-api" containerID="cri-o://a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea" gracePeriod=30 Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.615111 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-log" containerID="cri-o://3598a27cce744ba4368f8c3021a5a2979509933cb2d1c429bade24025ba6b037" gracePeriod=30 Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.615110 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-httpd" containerID="cri-o://a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e" gracePeriod=30 Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.615236 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-api" containerID="cri-o://fffa631dedda68e9081b9715d9fb8e0eea97fa5c2184e1bc227ba0ac0d8ecf26" gracePeriod=30 Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.615281 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-httpd" containerID="cri-o://9f526e39ad786fb7107204aaebc9f4fca1c7ff6754c30d170abb12b687553462" gracePeriod=30 Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.615497 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-log" 
containerID="cri-o://303084ff304c4049d9965d75aabee6bf368d9e556243c9f65468ad9a8427ca2d" gracePeriod=30 Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.615635 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-api" containerID="cri-o://07f4ba9781ab1a44d03850cb77daeeefcb63ec84b00b669ace1cd2914c5f6801" gracePeriod=30 Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.615643 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-httpd" containerID="cri-o://4e114ae4a5af7e5b561bba4f555c4a98852ffd20be449e5b69bede0c396f39b5" gracePeriod=30 Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.615813 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerName="glance-log" containerID="cri-o://ab525ee23d41d9c43242b85a9161e0dc019a3c75c0d9019ef0e09b7b41faaad6" gracePeriod=30 Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.616039 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerName="glance-api" containerID="cri-o://953fed04805b9fa8dc17e1d3a91063aad5de1fb0c53dcb17ca5918320c49bbac" gracePeriod=30 Mar 07 04:40:36 crc kubenswrapper[4689]: I0307 04:40:36.616133 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerName="glance-httpd" containerID="cri-o://572ffdb4e47cffc557664180d047863876402926745fb197765ba4cd1b2b7d67" gracePeriod=30 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 
04:40:37.598369 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.623346 4689 generic.go:334] "Generic (PLEG): container finished" podID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerID="953fed04805b9fa8dc17e1d3a91063aad5de1fb0c53dcb17ca5918320c49bbac" exitCode=0 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.623370 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"27c10bcf-0231-449b-8a8d-4f7dd44f7547","Type":"ContainerDied","Data":"953fed04805b9fa8dc17e1d3a91063aad5de1fb0c53dcb17ca5918320c49bbac"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.623405 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"27c10bcf-0231-449b-8a8d-4f7dd44f7547","Type":"ContainerDied","Data":"572ffdb4e47cffc557664180d047863876402926745fb197765ba4cd1b2b7d67"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.623379 4689 generic.go:334] "Generic (PLEG): container finished" podID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerID="572ffdb4e47cffc557664180d047863876402926745fb197765ba4cd1b2b7d67" exitCode=0 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.623424 4689 generic.go:334] "Generic (PLEG): container finished" podID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerID="ab525ee23d41d9c43242b85a9161e0dc019a3c75c0d9019ef0e09b7b41faaad6" exitCode=143 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.623473 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"27c10bcf-0231-449b-8a8d-4f7dd44f7547","Type":"ContainerDied","Data":"ab525ee23d41d9c43242b85a9161e0dc019a3c75c0d9019ef0e09b7b41faaad6"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.642955 4689 generic.go:334] "Generic (PLEG): container 
finished" podID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerID="fffa631dedda68e9081b9715d9fb8e0eea97fa5c2184e1bc227ba0ac0d8ecf26" exitCode=0 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.642986 4689 generic.go:334] "Generic (PLEG): container finished" podID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerID="9f526e39ad786fb7107204aaebc9f4fca1c7ff6754c30d170abb12b687553462" exitCode=0 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.642994 4689 generic.go:334] "Generic (PLEG): container finished" podID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerID="3598a27cce744ba4368f8c3021a5a2979509933cb2d1c429bade24025ba6b037" exitCode=143 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.643034 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"0982b9c9-87f9-40a1-b776-1e889e04caa4","Type":"ContainerDied","Data":"fffa631dedda68e9081b9715d9fb8e0eea97fa5c2184e1bc227ba0ac0d8ecf26"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.643060 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"0982b9c9-87f9-40a1-b776-1e889e04caa4","Type":"ContainerDied","Data":"9f526e39ad786fb7107204aaebc9f4fca1c7ff6754c30d170abb12b687553462"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.643069 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"0982b9c9-87f9-40a1-b776-1e889e04caa4","Type":"ContainerDied","Data":"3598a27cce744ba4368f8c3021a5a2979509933cb2d1c429bade24025ba6b037"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.644977 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-iscsi\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc 
kubenswrapper[4689]: I0307 04:40:37.645020 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645112 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-scripts\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645142 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-dev\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645109 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645198 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-run\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645252 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-run" (OuterVolumeSpecName: "run") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645268 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-dev" (OuterVolumeSpecName: "dev") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645300 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qpr8\" (UniqueName: \"kubernetes.io/projected/4c0a16f2-251d-4b9c-a03c-336d12a54add-kube-api-access-8qpr8\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645337 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-sys\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645358 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-var-locks-brick\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645393 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-sys" (OuterVolumeSpecName: "sys") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645407 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-httpd-run\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645426 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645433 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-config-data\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645463 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645527 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-lib-modules\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645566 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-nvme\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645592 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-logs\") pod \"4c0a16f2-251d-4b9c-a03c-336d12a54add\" (UID: \"4c0a16f2-251d-4b9c-a03c-336d12a54add\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645723 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.645807 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646136 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646195 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646213 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646228 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646241 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646252 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646264 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646276 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646662 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-logs" (OuterVolumeSpecName: "logs") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646658 4689 generic.go:334] "Generic (PLEG): container finished" podID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerID="07f4ba9781ab1a44d03850cb77daeeefcb63ec84b00b669ace1cd2914c5f6801" exitCode=0 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646685 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1b8ccc32-b665-4c4a-bece-bda801b97ba8","Type":"ContainerDied","Data":"07f4ba9781ab1a44d03850cb77daeeefcb63ec84b00b669ace1cd2914c5f6801"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646695 4689 generic.go:334] "Generic (PLEG): container finished" podID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerID="4e114ae4a5af7e5b561bba4f555c4a98852ffd20be449e5b69bede0c396f39b5" exitCode=0 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646713 4689 generic.go:334] "Generic (PLEG): container finished" podID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerID="303084ff304c4049d9965d75aabee6bf368d9e556243c9f65468ad9a8427ca2d" exitCode=143 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646719 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1b8ccc32-b665-4c4a-bece-bda801b97ba8","Type":"ContainerDied","Data":"4e114ae4a5af7e5b561bba4f555c4a98852ffd20be449e5b69bede0c396f39b5"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.646735 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"1b8ccc32-b665-4c4a-bece-bda801b97ba8","Type":"ContainerDied","Data":"303084ff304c4049d9965d75aabee6bf368d9e556243c9f65468ad9a8427ca2d"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.650918 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0a16f2-251d-4b9c-a03c-336d12a54add-kube-api-access-8qpr8" (OuterVolumeSpecName: "kube-api-access-8qpr8") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "kube-api-access-8qpr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.651192 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-scripts" (OuterVolumeSpecName: "scripts") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.654611 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.654600 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.658033 4689 generic.go:334] "Generic (PLEG): container finished" podID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerID="a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea" exitCode=0 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.658063 4689 generic.go:334] "Generic (PLEG): container finished" podID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerID="a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e" exitCode=0 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.658072 4689 generic.go:334] "Generic (PLEG): container finished" podID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerID="9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439" exitCode=143 Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.658093 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"4c0a16f2-251d-4b9c-a03c-336d12a54add","Type":"ContainerDied","Data":"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.658158 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"4c0a16f2-251d-4b9c-a03c-336d12a54add","Type":"ContainerDied","Data":"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.658206 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"4c0a16f2-251d-4b9c-a03c-336d12a54add","Type":"ContainerDied","Data":"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.658216 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" 
event={"ID":"4c0a16f2-251d-4b9c-a03c-336d12a54add","Type":"ContainerDied","Data":"0891b81225da304122ad3187b865fe33c1c251ffa5c184ce27b76caaff4a70d9"} Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.658231 4689 scope.go:117] "RemoveContainer" containerID="a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.658356 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.683226 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.689782 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.692250 4689 scope.go:117] "RemoveContainer" containerID="a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.732757 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-config-data" (OuterVolumeSpecName: "config-data") pod "4c0a16f2-251d-4b9c-a03c-336d12a54add" (UID: "4c0a16f2-251d-4b9c-a03c-336d12a54add"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.743524 4689 scope.go:117] "RemoveContainer" containerID="9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.746821 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-var-locks-brick\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.746860 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-logs\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.746878 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.746914 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-config-data\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.746932 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-iscsi\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.746950 4689 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t92x2\" (UniqueName: \"kubernetes.io/projected/0982b9c9-87f9-40a1-b776-1e889e04caa4-kube-api-access-t92x2\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.746970 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.746955 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.746989 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-dev\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747007 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-sys\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747034 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-scripts\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747072 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-scripts\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747095 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-dev\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747113 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-run\") pod 
\"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747134 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-httpd-run\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747149 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-lib-modules\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747195 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-var-locks-brick\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747215 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98msr\" (UniqueName: \"kubernetes.io/projected/27c10bcf-0231-449b-8a8d-4f7dd44f7547-kube-api-access-98msr\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747242 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-run\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747257 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747271 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-logs\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747284 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747297 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-lib-modules\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747318 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-config-data\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747352 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-httpd-run\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747370 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-iscsi\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747399 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-nvme\") pod \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\" (UID: \"27c10bcf-0231-449b-8a8d-4f7dd44f7547\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747413 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-nvme\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747438 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-sys\") pod \"0982b9c9-87f9-40a1-b776-1e889e04caa4\" (UID: \"0982b9c9-87f9-40a1-b776-1e889e04caa4\") " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747688 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qpr8\" (UniqueName: \"kubernetes.io/projected/4c0a16f2-251d-4b9c-a03c-336d12a54add-kube-api-access-8qpr8\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747721 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747746 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747760 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4c0a16f2-251d-4b9c-a03c-336d12a54add-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747772 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c0a16f2-251d-4b9c-a03c-336d12a54add-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747789 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747799 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.747810 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c0a16f2-251d-4b9c-a03c-336d12a54add-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.750015 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-scripts" (OuterVolumeSpecName: "scripts") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.750062 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-dev" (OuterVolumeSpecName: "dev") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.750088 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-sys" (OuterVolumeSpecName: "sys") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.750150 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.750400 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-logs" (OuterVolumeSpecName: "logs") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.750734 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-logs" (OuterVolumeSpecName: "logs") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.751087 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.751142 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-dev" (OuterVolumeSpecName: "dev") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.751161 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-run" (OuterVolumeSpecName: "run") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.751408 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.751434 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.752861 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0982b9c9-87f9-40a1-b776-1e889e04caa4-kube-api-access-t92x2" (OuterVolumeSpecName: "kube-api-access-t92x2") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "kube-api-access-t92x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.752945 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-run" (OuterVolumeSpecName: "run") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.752976 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.753721 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-scripts" (OuterVolumeSpecName: "scripts") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.754925 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.754937 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-sys" (OuterVolumeSpecName: "sys") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.754957 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.754968 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.755201 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.755877 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c10bcf-0231-449b-8a8d-4f7dd44f7547-kube-api-access-98msr" (OuterVolumeSpecName: "kube-api-access-98msr") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "kube-api-access-98msr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.763327 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.764187 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.764246 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance-cache") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.768734 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.769259 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance-cache") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.775740 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851152 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851441 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851451 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851459 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851468 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851476 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851487 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98msr\" (UniqueName: 
\"kubernetes.io/projected/27c10bcf-0231-449b-8a8d-4f7dd44f7547-kube-api-access-98msr\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851496 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851503 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851523 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851531 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0982b9c9-87f9-40a1-b776-1e889e04caa4-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851544 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851557 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851566 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851575 4689 reconciler_common.go:293] "Volume detached for 
volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851583 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851591 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851599 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851607 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851614 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27c10bcf-0231-449b-8a8d-4f7dd44f7547-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851628 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851637 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851648 4689 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-t92x2\" (UniqueName: \"kubernetes.io/projected/0982b9c9-87f9-40a1-b776-1e889e04caa4-kube-api-access-t92x2\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851660 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851669 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0982b9c9-87f9-40a1-b776-1e889e04caa4-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851677 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27c10bcf-0231-449b-8a8d-4f7dd44f7547-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.851684 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.865392 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-config-data" (OuterVolumeSpecName: "config-data") pod "0982b9c9-87f9-40a1-b776-1e889e04caa4" (UID: "0982b9c9-87f9-40a1-b776-1e889e04caa4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.865819 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.884760 4689 scope.go:117] "RemoveContainer" containerID="a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea" Mar 07 04:40:37 crc kubenswrapper[4689]: E0307 04:40:37.885328 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea\": container with ID starting with a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea not found: ID does not exist" containerID="a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.885396 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea"} err="failed to get container status \"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea\": rpc error: code = NotFound desc = could not find container \"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea\": container with ID starting with a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea not found: ID does not exist" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.885429 4689 scope.go:117] "RemoveContainer" containerID="a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e" Mar 07 04:40:37 crc kubenswrapper[4689]: E0307 04:40:37.885990 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e\": container with ID starting with 
a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e not found: ID does not exist" containerID="a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.886256 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e"} err="failed to get container status \"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e\": rpc error: code = NotFound desc = could not find container \"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e\": container with ID starting with a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e not found: ID does not exist" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.886319 4689 scope.go:117] "RemoveContainer" containerID="9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.889119 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 07 04:40:37 crc kubenswrapper[4689]: E0307 04:40:37.889557 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439\": container with ID starting with 9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439 not found: ID does not exist" containerID="9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.889810 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439"} err="failed to get container status \"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439\": rpc error: code = 
NotFound desc = could not find container \"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439\": container with ID starting with 9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439 not found: ID does not exist" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.889878 4689 scope.go:117] "RemoveContainer" containerID="a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.893552 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea"} err="failed to get container status \"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea\": rpc error: code = NotFound desc = could not find container \"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea\": container with ID starting with a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea not found: ID does not exist" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.893603 4689 scope.go:117] "RemoveContainer" containerID="a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.895151 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e"} err="failed to get container status \"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e\": rpc error: code = NotFound desc = could not find container \"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e\": container with ID starting with a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e not found: ID does not exist" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.895206 4689 scope.go:117] "RemoveContainer" containerID="9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439" Mar 07 04:40:37 crc 
kubenswrapper[4689]: I0307 04:40:37.895397 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439"} err="failed to get container status \"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439\": rpc error: code = NotFound desc = could not find container \"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439\": container with ID starting with 9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439 not found: ID does not exist" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.895415 4689 scope.go:117] "RemoveContainer" containerID="a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.895568 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea"} err="failed to get container status \"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea\": rpc error: code = NotFound desc = could not find container \"a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea\": container with ID starting with a65bdc495e3e6024fce25fb624837a254f9b125832afe7f2d772f031938cc0ea not found: ID does not exist" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.895584 4689 scope.go:117] "RemoveContainer" containerID="a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.895725 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e"} err="failed to get container status \"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e\": rpc error: code = NotFound desc = could not find container \"a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e\": container 
with ID starting with a4829f513f7c5dfc4fec2dd791759a7c23408945b334372da3ef6394f7aa3b2e not found: ID does not exist" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.895741 4689 scope.go:117] "RemoveContainer" containerID="9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.895784 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.896033 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439"} err="failed to get container status \"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439\": rpc error: code = NotFound desc = could not find container \"9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439\": container with ID starting with 9837123fbb8e02a5cbce01a1436eca526bd0b04add8bcc8eb47468ed940ff439 not found: ID does not exist" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.896578 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-config-data" (OuterVolumeSpecName: "config-data") pod "27c10bcf-0231-449b-8a8d-4f7dd44f7547" (UID: "27c10bcf-0231-449b-8a8d-4f7dd44f7547"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.898596 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.948007 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.953462 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.953489 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c10bcf-0231-449b-8a8d-4f7dd44f7547-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.953501 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.953509 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.953517 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:37 crc kubenswrapper[4689]: I0307 04:40:37.953525 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0982b9c9-87f9-40a1-b776-1e889e04caa4-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.009743 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.015466 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054365 4689 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054402 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-dev\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054432 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-sys\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054458 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-887h9\" (UniqueName: \"kubernetes.io/projected/1b8ccc32-b665-4c4a-bece-bda801b97ba8-kube-api-access-887h9\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054473 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-nvme\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054504 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-scripts\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc 
kubenswrapper[4689]: I0307 04:40:38.054520 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-httpd-run\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054534 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-lib-modules\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054525 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-sys" (OuterVolumeSpecName: "sys") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054567 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054594 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-dev" (OuterVolumeSpecName: "dev") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054627 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054754 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054840 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.054560 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-iscsi\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055027 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-logs\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055051 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055164 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-config-data\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055207 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-run\") pod \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055247 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-var-locks-brick\") pod 
\"1b8ccc32-b665-4c4a-bece-bda801b97ba8\" (UID: \"1b8ccc32-b665-4c4a-bece-bda801b97ba8\") " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055358 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-logs" (OuterVolumeSpecName: "logs") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055555 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055573 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055585 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055596 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055605 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055615 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b8ccc32-b665-4c4a-bece-bda801b97ba8-logs\") 
on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055626 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055657 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.055684 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-run" (OuterVolumeSpecName: "run") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.057631 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.057855 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b8ccc32-b665-4c4a-bece-bda801b97ba8-kube-api-access-887h9" (OuterVolumeSpecName: "kube-api-access-887h9") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "kube-api-access-887h9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.057988 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-scripts" (OuterVolumeSpecName: "scripts") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.059129 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.120936 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-config-data" (OuterVolumeSpecName: "config-data") pod "1b8ccc32-b665-4c4a-bece-bda801b97ba8" (UID: "1b8ccc32-b665-4c4a-bece-bda801b97ba8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.157269 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.157306 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.157317 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1b8ccc32-b665-4c4a-bece-bda801b97ba8-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.157369 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.157385 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-887h9\" (UniqueName: \"kubernetes.io/projected/1b8ccc32-b665-4c4a-bece-bda801b97ba8-kube-api-access-887h9\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.157398 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8ccc32-b665-4c4a-bece-bda801b97ba8-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.157422 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.170536 4689 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.179303 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.259658 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.259692 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.674301 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"0982b9c9-87f9-40a1-b776-1e889e04caa4","Type":"ContainerDied","Data":"d8944146cbcb382d5d3618dc99811769abcbc81ac9b47857e4b41a511dfbb66f"} Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.674367 4689 scope.go:117] "RemoveContainer" containerID="fffa631dedda68e9081b9715d9fb8e0eea97fa5c2184e1bc227ba0ac0d8ecf26" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.674420 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.680879 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1b8ccc32-b665-4c4a-bece-bda801b97ba8","Type":"ContainerDied","Data":"e8899ae5b3d6c26e6e49d5d1f982784cc4b8de0cd01c63345a0eef57d2f6ed60"} Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.680973 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.687342 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"27c10bcf-0231-449b-8a8d-4f7dd44f7547","Type":"ContainerDied","Data":"1ad770038b78d1a65cda73d1e2e1855b2ead24beb8c5eab46c743de4c51bfa29"} Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.687444 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.723369 4689 scope.go:117] "RemoveContainer" containerID="9f526e39ad786fb7107204aaebc9f4fca1c7ff6754c30d170abb12b687553462" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.755483 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.767405 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.774327 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.781830 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.794791 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.799789 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.805385 4689 scope.go:117] "RemoveContainer" 
containerID="3598a27cce744ba4368f8c3021a5a2979509933cb2d1c429bade24025ba6b037" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.823489 4689 scope.go:117] "RemoveContainer" containerID="07f4ba9781ab1a44d03850cb77daeeefcb63ec84b00b669ace1cd2914c5f6801" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.845827 4689 scope.go:117] "RemoveContainer" containerID="4e114ae4a5af7e5b561bba4f555c4a98852ffd20be449e5b69bede0c396f39b5" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.863300 4689 scope.go:117] "RemoveContainer" containerID="303084ff304c4049d9965d75aabee6bf368d9e556243c9f65468ad9a8427ca2d" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.882403 4689 scope.go:117] "RemoveContainer" containerID="953fed04805b9fa8dc17e1d3a91063aad5de1fb0c53dcb17ca5918320c49bbac" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.899958 4689 scope.go:117] "RemoveContainer" containerID="572ffdb4e47cffc557664180d047863876402926745fb197765ba4cd1b2b7d67" Mar 07 04:40:38 crc kubenswrapper[4689]: I0307 04:40:38.919269 4689 scope.go:117] "RemoveContainer" containerID="ab525ee23d41d9c43242b85a9161e0dc019a3c75c0d9019ef0e09b7b41faaad6" Mar 07 04:40:39 crc kubenswrapper[4689]: I0307 04:40:39.841694 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" path="/var/lib/kubelet/pods/0982b9c9-87f9-40a1-b776-1e889e04caa4/volumes" Mar 07 04:40:39 crc kubenswrapper[4689]: I0307 04:40:39.843827 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" path="/var/lib/kubelet/pods/1b8ccc32-b665-4c4a-bece-bda801b97ba8/volumes" Mar 07 04:40:39 crc kubenswrapper[4689]: I0307 04:40:39.846518 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" path="/var/lib/kubelet/pods/27c10bcf-0231-449b-8a8d-4f7dd44f7547/volumes" Mar 07 04:40:39 crc kubenswrapper[4689]: I0307 04:40:39.848049 4689 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" path="/var/lib/kubelet/pods/4c0a16f2-251d-4b9c-a03c-336d12a54add/volumes" Mar 07 04:40:39 crc kubenswrapper[4689]: I0307 04:40:39.963067 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:40:39 crc kubenswrapper[4689]: I0307 04:40:39.963783 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-log" containerID="cri-o://39698ded7b0a3ed528848dbb68451914b83d2e6550ad1187448aed1936996a0d" gracePeriod=30 Mar 07 04:40:39 crc kubenswrapper[4689]: I0307 04:40:39.964217 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-api" containerID="cri-o://34f93fc3f5cf43acb412e66e235fad4f36a65fa810cf86bc11f803e334789008" gracePeriod=30 Mar 07 04:40:39 crc kubenswrapper[4689]: I0307 04:40:39.964447 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-httpd" containerID="cri-o://54ecd246f2e60abf9a2d13aca438ca4411ec72e07783071da6d30e62c5b8a471" gracePeriod=30 Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.484986 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.485335 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" containerName="glance-log" containerID="cri-o://2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5" gracePeriod=30 Mar 07 04:40:40 crc 
kubenswrapper[4689]: I0307 04:40:40.485724 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" containerName="glance-httpd" containerID="cri-o://efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a" gracePeriod=30 Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.485879 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" containerName="glance-api" containerID="cri-o://547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b" gracePeriod=30 Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.708619 4689 generic.go:334] "Generic (PLEG): container finished" podID="992376fa-f803-4a38-859a-3ddc5b52a191" containerID="2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5" exitCode=143 Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.708690 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"992376fa-f803-4a38-859a-3ddc5b52a191","Type":"ContainerDied","Data":"2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5"} Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.710883 4689 generic.go:334] "Generic (PLEG): container finished" podID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerID="34f93fc3f5cf43acb412e66e235fad4f36a65fa810cf86bc11f803e334789008" exitCode=0 Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.710905 4689 generic.go:334] "Generic (PLEG): container finished" podID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerID="54ecd246f2e60abf9a2d13aca438ca4411ec72e07783071da6d30e62c5b8a471" exitCode=0 Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.710931 4689 generic.go:334] "Generic (PLEG): container finished" podID="e5e79db1-aa6d-4206-ab3e-3f722931924d" 
containerID="39698ded7b0a3ed528848dbb68451914b83d2e6550ad1187448aed1936996a0d" exitCode=143 Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.710952 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"e5e79db1-aa6d-4206-ab3e-3f722931924d","Type":"ContainerDied","Data":"34f93fc3f5cf43acb412e66e235fad4f36a65fa810cf86bc11f803e334789008"} Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.711004 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"e5e79db1-aa6d-4206-ab3e-3f722931924d","Type":"ContainerDied","Data":"54ecd246f2e60abf9a2d13aca438ca4411ec72e07783071da6d30e62c5b8a471"} Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.711016 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"e5e79db1-aa6d-4206-ab3e-3f722931924d","Type":"ContainerDied","Data":"39698ded7b0a3ed528848dbb68451914b83d2e6550ad1187448aed1936996a0d"} Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.797849 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900209 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-config-data\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900276 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-dev\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900326 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-scripts\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900359 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-iscsi\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900381 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-nvme\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900420 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900492 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-lib-modules\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900518 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-httpd-run\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900547 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900566 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-var-locks-brick\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900629 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-run\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.900969 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-logs\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.901022 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xtsf\" (UniqueName: \"kubernetes.io/projected/e5e79db1-aa6d-4206-ab3e-3f722931924d-kube-api-access-4xtsf\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.901042 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-sys\") pod \"e5e79db1-aa6d-4206-ab3e-3f722931924d\" (UID: \"e5e79db1-aa6d-4206-ab3e-3f722931924d\") " Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.902266 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.902761 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-run" (OuterVolumeSpecName: "run") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.902796 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.902883 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.902923 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-dev" (OuterVolumeSpecName: "dev") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.902960 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.903298 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-logs" (OuterVolumeSpecName: "logs") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.903342 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-sys" (OuterVolumeSpecName: "sys") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.904214 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.906646 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance-cache") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "local-storage19-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.906650 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e79db1-aa6d-4206-ab3e-3f722931924d-kube-api-access-4xtsf" (OuterVolumeSpecName: "kube-api-access-4xtsf") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "kube-api-access-4xtsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.908459 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-scripts" (OuterVolumeSpecName: "scripts") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.909369 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:40 crc kubenswrapper[4689]: I0307 04:40:40.967132 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-config-data" (OuterVolumeSpecName: "config-data") pod "e5e79db1-aa6d-4206-ab3e-3f722931924d" (UID: "e5e79db1-aa6d-4206-ab3e-3f722931924d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002461 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002500 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002543 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002556 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002604 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002616 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5e79db1-aa6d-4206-ab3e-3f722931924d-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002627 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xtsf\" (UniqueName: \"kubernetes.io/projected/e5e79db1-aa6d-4206-ab3e-3f722931924d-kube-api-access-4xtsf\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002660 4689 reconciler_common.go:293] "Volume detached for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002673 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002684 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002759 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e79db1-aa6d-4206-ab3e-3f722931924d-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002771 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002798 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e5e79db1-aa6d-4206-ab3e-3f722931924d-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.002814 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.017997 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.021059 4689 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.104499 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.104527 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.310239 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407117 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-lib-modules\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407189 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmg9v\" (UniqueName: \"kubernetes.io/projected/992376fa-f803-4a38-859a-3ddc5b52a191-kube-api-access-pmg9v\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407212 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-scripts\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407227 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dev\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-dev\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407264 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-run\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407323 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-httpd-run\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407251 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407351 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407282 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-dev" (OuterVolumeSpecName: "dev") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407365 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-var-locks-brick\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407302 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-run" (OuterVolumeSpecName: "run") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407377 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-iscsi\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407392 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407402 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407429 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407430 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-config-data\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407502 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-sys\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407527 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-nvme\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407545 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-logs\") pod \"992376fa-f803-4a38-859a-3ddc5b52a191\" (UID: \"992376fa-f803-4a38-859a-3ddc5b52a191\") " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407780 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-sys" (OuterVolumeSpecName: "sys") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407863 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407972 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.407983 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.408000 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.408007 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-dev\") on node \"crc\" 
DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.408015 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.408022 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.408030 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/992376fa-f803-4a38-859a-3ddc5b52a191-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.408064 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-logs" (OuterVolumeSpecName: "logs") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.408214 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.411359 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-scripts" (OuterVolumeSpecName: "scripts") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.411366 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.411451 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/992376fa-f803-4a38-859a-3ddc5b52a191-kube-api-access-pmg9v" (OuterVolumeSpecName: "kube-api-access-pmg9v") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "kube-api-access-pmg9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.413268 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance-cache") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.470370 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-config-data" (OuterVolumeSpecName: "config-data") pod "992376fa-f803-4a38-859a-3ddc5b52a191" (UID: "992376fa-f803-4a38-859a-3ddc5b52a191"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.509601 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.509659 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.509683 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.509692 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.509702 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992376fa-f803-4a38-859a-3ddc5b52a191-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.509710 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmg9v\" (UniqueName: \"kubernetes.io/projected/992376fa-f803-4a38-859a-3ddc5b52a191-kube-api-access-pmg9v\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.509719 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992376fa-f803-4a38-859a-3ddc5b52a191-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.522652 4689 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.527790 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.610423 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.610455 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.724467 4689 generic.go:334] "Generic (PLEG): container finished" podID="992376fa-f803-4a38-859a-3ddc5b52a191" containerID="547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b" exitCode=0 Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.724699 4689 generic.go:334] "Generic (PLEG): container finished" podID="992376fa-f803-4a38-859a-3ddc5b52a191" containerID="efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a" exitCode=0 Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.724523 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"992376fa-f803-4a38-859a-3ddc5b52a191","Type":"ContainerDied","Data":"547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b"} Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.724543 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.724756 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"992376fa-f803-4a38-859a-3ddc5b52a191","Type":"ContainerDied","Data":"efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a"} Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.724781 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"992376fa-f803-4a38-859a-3ddc5b52a191","Type":"ContainerDied","Data":"947a912681b3b8dd37306772b392b6ec7d6c91e5ad787671e07d4353c4890062"} Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.724798 4689 scope.go:117] "RemoveContainer" containerID="547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.727186 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"e5e79db1-aa6d-4206-ab3e-3f722931924d","Type":"ContainerDied","Data":"b77799847633e1df76b9fc201abbd1e10bfc9d4ed78cf17a728b24fce5f5d0bd"} Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.727233 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.754211 4689 scope.go:117] "RemoveContainer" containerID="efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.757719 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.779269 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.786844 4689 scope.go:117] "RemoveContainer" containerID="2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.791192 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.802259 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.810505 4689 scope.go:117] "RemoveContainer" containerID="547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b" Mar 07 04:40:41 crc kubenswrapper[4689]: E0307 04:40:41.810987 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b\": container with ID starting with 547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b not found: ID does not exist" containerID="547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.811023 4689 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b"} err="failed to get container status \"547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b\": rpc error: code = NotFound desc = could not find container \"547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b\": container with ID starting with 547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b not found: ID does not exist" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.811048 4689 scope.go:117] "RemoveContainer" containerID="efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a" Mar 07 04:40:41 crc kubenswrapper[4689]: E0307 04:40:41.811445 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a\": container with ID starting with efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a not found: ID does not exist" containerID="efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.811478 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a"} err="failed to get container status \"efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a\": rpc error: code = NotFound desc = could not find container \"efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a\": container with ID starting with efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a not found: ID does not exist" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.811498 4689 scope.go:117] "RemoveContainer" containerID="2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5" Mar 07 04:40:41 crc kubenswrapper[4689]: E0307 04:40:41.811947 4689 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5\": container with ID starting with 2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5 not found: ID does not exist" containerID="2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.811978 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5"} err="failed to get container status \"2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5\": rpc error: code = NotFound desc = could not find container \"2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5\": container with ID starting with 2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5 not found: ID does not exist" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.811997 4689 scope.go:117] "RemoveContainer" containerID="547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.813034 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b"} err="failed to get container status \"547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b\": rpc error: code = NotFound desc = could not find container \"547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b\": container with ID starting with 547e728de1deb7237aa6d9d2a95ceab4f9826984f6057fcfcb9d471683b73b0b not found: ID does not exist" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.813277 4689 scope.go:117] "RemoveContainer" containerID="efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.813526 4689 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a"} err="failed to get container status \"efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a\": rpc error: code = NotFound desc = could not find container \"efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a\": container with ID starting with efb239f87642c79d0204d1d3d66dcbf2c3685bcdc0a27cdeba0844a47e3d0c0a not found: ID does not exist" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.813555 4689 scope.go:117] "RemoveContainer" containerID="2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.813954 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5"} err="failed to get container status \"2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5\": rpc error: code = NotFound desc = could not find container \"2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5\": container with ID starting with 2998d3c2e66d3ae984f28a882725fb850d0e5c95b49f2c8f988e27c756136ab5 not found: ID does not exist" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.813980 4689 scope.go:117] "RemoveContainer" containerID="34f93fc3f5cf43acb412e66e235fad4f36a65fa810cf86bc11f803e334789008" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.836524 4689 scope.go:117] "RemoveContainer" containerID="54ecd246f2e60abf9a2d13aca438ca4411ec72e07783071da6d30e62c5b8a471" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.839414 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" path="/var/lib/kubelet/pods/992376fa-f803-4a38-859a-3ddc5b52a191/volumes" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.840457 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" path="/var/lib/kubelet/pods/e5e79db1-aa6d-4206-ab3e-3f722931924d/volumes" Mar 07 04:40:41 crc kubenswrapper[4689]: I0307 04:40:41.868480 4689 scope.go:117] "RemoveContainer" containerID="39698ded7b0a3ed528848dbb68451914b83d2e6550ad1187448aed1936996a0d" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.000914 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tbm46"] Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.009400 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-tbm46"] Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.032209 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance9726-account-delete-chbgt"] Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.032726 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.032810 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.032927 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.032999 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.033077 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.033205 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" containerName="glance-httpd" Mar 07 04:40:43 crc 
kubenswrapper[4689]: E0307 04:40:43.033298 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.033368 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.033444 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.033513 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.033589 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.033677 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.033764 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.033844 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.033918 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.033986 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.034059 4689 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.034134 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.034267 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.034360 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.034454 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.034525 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.034601 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.034669 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.034845 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.034918 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.034992 4689 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.035061 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.035136 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.035238 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.035322 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.035404 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.035477 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.035548 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: E0307 04:40:43.035623 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.035695 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.035942 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" 
containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036019 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036094 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036189 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036266 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036338 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036427 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036500 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036573 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e79db1-aa6d-4206-ab3e-3f722931924d" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036647 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036724 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" 
containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036796 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0a16f2-251d-4b9c-a03c-336d12a54add" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036868 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8ccc32-b665-4c4a-bece-bda801b97ba8" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.036982 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.037071 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="27c10bcf-0231-449b-8a8d-4f7dd44f7547" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.037186 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-httpd" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.037267 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="992376fa-f803-4a38-859a-3ddc5b52a191" containerName="glance-log" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.037338 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="0982b9c9-87f9-40a1-b776-1e889e04caa4" containerName="glance-api" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.037948 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance9726-account-delete-chbgt" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.053472 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance9726-account-delete-chbgt"] Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.131985 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv85n\" (UniqueName: \"kubernetes.io/projected/413e722e-bb70-4638-983a-a1856e6ee4ec-kube-api-access-rv85n\") pod \"glance9726-account-delete-chbgt\" (UID: \"413e722e-bb70-4638-983a-a1856e6ee4ec\") " pod="glance-kuttl-tests/glance9726-account-delete-chbgt" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.132106 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413e722e-bb70-4638-983a-a1856e6ee4ec-operator-scripts\") pod \"glance9726-account-delete-chbgt\" (UID: \"413e722e-bb70-4638-983a-a1856e6ee4ec\") " pod="glance-kuttl-tests/glance9726-account-delete-chbgt" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.233352 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413e722e-bb70-4638-983a-a1856e6ee4ec-operator-scripts\") pod \"glance9726-account-delete-chbgt\" (UID: \"413e722e-bb70-4638-983a-a1856e6ee4ec\") " pod="glance-kuttl-tests/glance9726-account-delete-chbgt" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.233688 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv85n\" (UniqueName: \"kubernetes.io/projected/413e722e-bb70-4638-983a-a1856e6ee4ec-kube-api-access-rv85n\") pod \"glance9726-account-delete-chbgt\" (UID: \"413e722e-bb70-4638-983a-a1856e6ee4ec\") " pod="glance-kuttl-tests/glance9726-account-delete-chbgt" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 
04:40:43.234878 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413e722e-bb70-4638-983a-a1856e6ee4ec-operator-scripts\") pod \"glance9726-account-delete-chbgt\" (UID: \"413e722e-bb70-4638-983a-a1856e6ee4ec\") " pod="glance-kuttl-tests/glance9726-account-delete-chbgt" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.250885 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv85n\" (UniqueName: \"kubernetes.io/projected/413e722e-bb70-4638-983a-a1856e6ee4ec-kube-api-access-rv85n\") pod \"glance9726-account-delete-chbgt\" (UID: \"413e722e-bb70-4638-983a-a1856e6ee4ec\") " pod="glance-kuttl-tests/glance9726-account-delete-chbgt" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.354873 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance9726-account-delete-chbgt" Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.579908 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance9726-account-delete-chbgt"] Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.744635 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance9726-account-delete-chbgt" event={"ID":"413e722e-bb70-4638-983a-a1856e6ee4ec","Type":"ContainerStarted","Data":"e05b109a693f99445447964f20deee86a915615e415fb5f778e9bfaab0a666ec"} Mar 07 04:40:43 crc kubenswrapper[4689]: I0307 04:40:43.838231 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="910303bb-e941-4f39-ab22-16b6d79339c6" path="/var/lib/kubelet/pods/910303bb-e941-4f39-ab22-16b6d79339c6/volumes" Mar 07 04:40:44 crc kubenswrapper[4689]: I0307 04:40:44.761458 4689 generic.go:334] "Generic (PLEG): container finished" podID="413e722e-bb70-4638-983a-a1856e6ee4ec" containerID="96ecd1348d8aedf7e7c82ab6d63a083426f803c2869df540471a05f43c70ebcf" exitCode=0 Mar 07 04:40:44 crc 
kubenswrapper[4689]: I0307 04:40:44.761972 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance9726-account-delete-chbgt" event={"ID":"413e722e-bb70-4638-983a-a1856e6ee4ec","Type":"ContainerDied","Data":"96ecd1348d8aedf7e7c82ab6d63a083426f803c2869df540471a05f43c70ebcf"} Mar 07 04:40:46 crc kubenswrapper[4689]: I0307 04:40:46.072663 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance9726-account-delete-chbgt" Mar 07 04:40:46 crc kubenswrapper[4689]: I0307 04:40:46.187723 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv85n\" (UniqueName: \"kubernetes.io/projected/413e722e-bb70-4638-983a-a1856e6ee4ec-kube-api-access-rv85n\") pod \"413e722e-bb70-4638-983a-a1856e6ee4ec\" (UID: \"413e722e-bb70-4638-983a-a1856e6ee4ec\") " Mar 07 04:40:46 crc kubenswrapper[4689]: I0307 04:40:46.187896 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413e722e-bb70-4638-983a-a1856e6ee4ec-operator-scripts\") pod \"413e722e-bb70-4638-983a-a1856e6ee4ec\" (UID: \"413e722e-bb70-4638-983a-a1856e6ee4ec\") " Mar 07 04:40:46 crc kubenswrapper[4689]: I0307 04:40:46.189299 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/413e722e-bb70-4638-983a-a1856e6ee4ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "413e722e-bb70-4638-983a-a1856e6ee4ec" (UID: "413e722e-bb70-4638-983a-a1856e6ee4ec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:40:46 crc kubenswrapper[4689]: I0307 04:40:46.195956 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/413e722e-bb70-4638-983a-a1856e6ee4ec-kube-api-access-rv85n" (OuterVolumeSpecName: "kube-api-access-rv85n") pod "413e722e-bb70-4638-983a-a1856e6ee4ec" (UID: "413e722e-bb70-4638-983a-a1856e6ee4ec"). InnerVolumeSpecName "kube-api-access-rv85n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:46 crc kubenswrapper[4689]: I0307 04:40:46.289756 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv85n\" (UniqueName: \"kubernetes.io/projected/413e722e-bb70-4638-983a-a1856e6ee4ec-kube-api-access-rv85n\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:46 crc kubenswrapper[4689]: I0307 04:40:46.289788 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413e722e-bb70-4638-983a-a1856e6ee4ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:46 crc kubenswrapper[4689]: I0307 04:40:46.778430 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance9726-account-delete-chbgt" event={"ID":"413e722e-bb70-4638-983a-a1856e6ee4ec","Type":"ContainerDied","Data":"e05b109a693f99445447964f20deee86a915615e415fb5f778e9bfaab0a666ec"} Mar 07 04:40:46 crc kubenswrapper[4689]: I0307 04:40:46.778467 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e05b109a693f99445447964f20deee86a915615e415fb5f778e9bfaab0a666ec" Mar 07 04:40:46 crc kubenswrapper[4689]: I0307 04:40:46.778469 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance9726-account-delete-chbgt" Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.062439 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-l6584"] Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.074820 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-l6584"] Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.087406 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance9726-account-delete-chbgt"] Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.095159 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-9726-account-create-update-brmpz"] Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.102292 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance9726-account-delete-chbgt"] Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.110618 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-9726-account-create-update-brmpz"] Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.850308 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-prqhz"] Mar 07 04:40:48 crc kubenswrapper[4689]: E0307 04:40:48.850572 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413e722e-bb70-4638-983a-a1856e6ee4ec" containerName="mariadb-account-delete" Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.850585 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="413e722e-bb70-4638-983a-a1856e6ee4ec" containerName="mariadb-account-delete" Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.850734 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="413e722e-bb70-4638-983a-a1856e6ee4ec" containerName="mariadb-account-delete" Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.851187 4689 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-prqhz" Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.862567 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-prqhz"] Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.954846 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gmmv\" (UniqueName: \"kubernetes.io/projected/7e18750b-34ac-4c21-8b43-7b6dad049da6-kube-api-access-2gmmv\") pod \"glance-db-create-prqhz\" (UID: \"7e18750b-34ac-4c21-8b43-7b6dad049da6\") " pod="glance-kuttl-tests/glance-db-create-prqhz" Mar 07 04:40:48 crc kubenswrapper[4689]: I0307 04:40:48.954932 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e18750b-34ac-4c21-8b43-7b6dad049da6-operator-scripts\") pod \"glance-db-create-prqhz\" (UID: \"7e18750b-34ac-4c21-8b43-7b6dad049da6\") " pod="glance-kuttl-tests/glance-db-create-prqhz" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.056323 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gmmv\" (UniqueName: \"kubernetes.io/projected/7e18750b-34ac-4c21-8b43-7b6dad049da6-kube-api-access-2gmmv\") pod \"glance-db-create-prqhz\" (UID: \"7e18750b-34ac-4c21-8b43-7b6dad049da6\") " pod="glance-kuttl-tests/glance-db-create-prqhz" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.056403 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e18750b-34ac-4c21-8b43-7b6dad049da6-operator-scripts\") pod \"glance-db-create-prqhz\" (UID: \"7e18750b-34ac-4c21-8b43-7b6dad049da6\") " pod="glance-kuttl-tests/glance-db-create-prqhz" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.057218 4689 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["glance-kuttl-tests/glance-99c7-account-create-update-65cxn"] Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.057462 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e18750b-34ac-4c21-8b43-7b6dad049da6-operator-scripts\") pod \"glance-db-create-prqhz\" (UID: \"7e18750b-34ac-4c21-8b43-7b6dad049da6\") " pod="glance-kuttl-tests/glance-db-create-prqhz" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.058042 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.060531 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.070255 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-99c7-account-create-update-65cxn"] Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.076066 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gmmv\" (UniqueName: \"kubernetes.io/projected/7e18750b-34ac-4c21-8b43-7b6dad049da6-kube-api-access-2gmmv\") pod \"glance-db-create-prqhz\" (UID: \"7e18750b-34ac-4c21-8b43-7b6dad049da6\") " pod="glance-kuttl-tests/glance-db-create-prqhz" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.158256 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-operator-scripts\") pod \"glance-99c7-account-create-update-65cxn\" (UID: \"b14ab1c2-06d1-4279-bfa3-56af17e87ec2\") " pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.158454 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svvjb\" (UniqueName: \"kubernetes.io/projected/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-kube-api-access-svvjb\") pod \"glance-99c7-account-create-update-65cxn\" (UID: \"b14ab1c2-06d1-4279-bfa3-56af17e87ec2\") " pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.165885 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-prqhz" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.259933 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-operator-scripts\") pod \"glance-99c7-account-create-update-65cxn\" (UID: \"b14ab1c2-06d1-4279-bfa3-56af17e87ec2\") " pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.260388 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svvjb\" (UniqueName: \"kubernetes.io/projected/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-kube-api-access-svvjb\") pod \"glance-99c7-account-create-update-65cxn\" (UID: \"b14ab1c2-06d1-4279-bfa3-56af17e87ec2\") " pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.261384 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-operator-scripts\") pod \"glance-99c7-account-create-update-65cxn\" (UID: \"b14ab1c2-06d1-4279-bfa3-56af17e87ec2\") " pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.294910 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svvjb\" (UniqueName: 
\"kubernetes.io/projected/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-kube-api-access-svvjb\") pod \"glance-99c7-account-create-update-65cxn\" (UID: \"b14ab1c2-06d1-4279-bfa3-56af17e87ec2\") " pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.376213 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.668791 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-prqhz"] Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.768094 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-99c7-account-create-update-65cxn"] Mar 07 04:40:49 crc kubenswrapper[4689]: W0307 04:40:49.797398 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb14ab1c2_06d1_4279_bfa3_56af17e87ec2.slice/crio-33d26090075addcea3430aa80fbd24bade2d1f067bae3105bdb76fbacb396b26 WatchSource:0}: Error finding container 33d26090075addcea3430aa80fbd24bade2d1f067bae3105bdb76fbacb396b26: Status 404 returned error can't find the container with id 33d26090075addcea3430aa80fbd24bade2d1f067bae3105bdb76fbacb396b26 Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.806562 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-prqhz" event={"ID":"7e18750b-34ac-4c21-8b43-7b6dad049da6","Type":"ContainerStarted","Data":"335f841741426b2bcab2a9f33416d23ac2c936f6cd4e9509529af5c39983c5a5"} Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.841624 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="304ba78d-8988-4227-a87a-188ec9bb00d7" path="/var/lib/kubelet/pods/304ba78d-8988-4227-a87a-188ec9bb00d7/volumes" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.843734 4689 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="413e722e-bb70-4638-983a-a1856e6ee4ec" path="/var/lib/kubelet/pods/413e722e-bb70-4638-983a-a1856e6ee4ec/volumes" Mar 07 04:40:49 crc kubenswrapper[4689]: I0307 04:40:49.845830 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57bb4f6f-37c7-4ebb-8431-4371063f99a4" path="/var/lib/kubelet/pods/57bb4f6f-37c7-4ebb-8431-4371063f99a4/volumes" Mar 07 04:40:50 crc kubenswrapper[4689]: I0307 04:40:50.825997 4689 generic.go:334] "Generic (PLEG): container finished" podID="b14ab1c2-06d1-4279-bfa3-56af17e87ec2" containerID="cacb5db30963fd44e517647a02a03df3f2281fc0e783bb876f4cd09b706e1ca5" exitCode=0 Mar 07 04:40:50 crc kubenswrapper[4689]: I0307 04:40:50.826075 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" event={"ID":"b14ab1c2-06d1-4279-bfa3-56af17e87ec2","Type":"ContainerDied","Data":"cacb5db30963fd44e517647a02a03df3f2281fc0e783bb876f4cd09b706e1ca5"} Mar 07 04:40:50 crc kubenswrapper[4689]: I0307 04:40:50.828262 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" event={"ID":"b14ab1c2-06d1-4279-bfa3-56af17e87ec2","Type":"ContainerStarted","Data":"33d26090075addcea3430aa80fbd24bade2d1f067bae3105bdb76fbacb396b26"} Mar 07 04:40:50 crc kubenswrapper[4689]: I0307 04:40:50.837935 4689 generic.go:334] "Generic (PLEG): container finished" podID="7e18750b-34ac-4c21-8b43-7b6dad049da6" containerID="e3022f74671f98f056db6c0e381371a3e55ee18b9e417b193de56354d2305b91" exitCode=0 Mar 07 04:40:50 crc kubenswrapper[4689]: I0307 04:40:50.837993 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-prqhz" event={"ID":"7e18750b-34ac-4c21-8b43-7b6dad049da6","Type":"ContainerDied","Data":"e3022f74671f98f056db6c0e381371a3e55ee18b9e417b193de56354d2305b91"} Mar 07 04:40:51 crc kubenswrapper[4689]: I0307 04:40:51.182555 4689 
scope.go:117] "RemoveContainer" containerID="d8d5ff449879a4e7c2e212532eda9528345078d1c13fa4d42b32b9082385d81e" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.160084 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-prqhz" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.171632 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.311388 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-operator-scripts\") pod \"b14ab1c2-06d1-4279-bfa3-56af17e87ec2\" (UID: \"b14ab1c2-06d1-4279-bfa3-56af17e87ec2\") " Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.311469 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svvjb\" (UniqueName: \"kubernetes.io/projected/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-kube-api-access-svvjb\") pod \"b14ab1c2-06d1-4279-bfa3-56af17e87ec2\" (UID: \"b14ab1c2-06d1-4279-bfa3-56af17e87ec2\") " Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.311605 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e18750b-34ac-4c21-8b43-7b6dad049da6-operator-scripts\") pod \"7e18750b-34ac-4c21-8b43-7b6dad049da6\" (UID: \"7e18750b-34ac-4c21-8b43-7b6dad049da6\") " Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.311656 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gmmv\" (UniqueName: \"kubernetes.io/projected/7e18750b-34ac-4c21-8b43-7b6dad049da6-kube-api-access-2gmmv\") pod \"7e18750b-34ac-4c21-8b43-7b6dad049da6\" (UID: \"7e18750b-34ac-4c21-8b43-7b6dad049da6\") " Mar 07 04:40:52 crc 
kubenswrapper[4689]: I0307 04:40:52.312354 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b14ab1c2-06d1-4279-bfa3-56af17e87ec2" (UID: "b14ab1c2-06d1-4279-bfa3-56af17e87ec2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.312435 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e18750b-34ac-4c21-8b43-7b6dad049da6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e18750b-34ac-4c21-8b43-7b6dad049da6" (UID: "7e18750b-34ac-4c21-8b43-7b6dad049da6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.316816 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e18750b-34ac-4c21-8b43-7b6dad049da6-kube-api-access-2gmmv" (OuterVolumeSpecName: "kube-api-access-2gmmv") pod "7e18750b-34ac-4c21-8b43-7b6dad049da6" (UID: "7e18750b-34ac-4c21-8b43-7b6dad049da6"). InnerVolumeSpecName "kube-api-access-2gmmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.318454 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-kube-api-access-svvjb" (OuterVolumeSpecName: "kube-api-access-svvjb") pod "b14ab1c2-06d1-4279-bfa3-56af17e87ec2" (UID: "b14ab1c2-06d1-4279-bfa3-56af17e87ec2"). InnerVolumeSpecName "kube-api-access-svvjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.413940 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.414002 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svvjb\" (UniqueName: \"kubernetes.io/projected/b14ab1c2-06d1-4279-bfa3-56af17e87ec2-kube-api-access-svvjb\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.414027 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e18750b-34ac-4c21-8b43-7b6dad049da6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.414050 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gmmv\" (UniqueName: \"kubernetes.io/projected/7e18750b-34ac-4c21-8b43-7b6dad049da6-kube-api-access-2gmmv\") on node \"crc\" DevicePath \"\"" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.856522 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" event={"ID":"b14ab1c2-06d1-4279-bfa3-56af17e87ec2","Type":"ContainerDied","Data":"33d26090075addcea3430aa80fbd24bade2d1f067bae3105bdb76fbacb396b26"} Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.856592 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33d26090075addcea3430aa80fbd24bade2d1f067bae3105bdb76fbacb396b26" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.856586 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-99c7-account-create-update-65cxn" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.858819 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-prqhz" event={"ID":"7e18750b-34ac-4c21-8b43-7b6dad049da6","Type":"ContainerDied","Data":"335f841741426b2bcab2a9f33416d23ac2c936f6cd4e9509529af5c39983c5a5"} Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.858886 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="335f841741426b2bcab2a9f33416d23ac2c936f6cd4e9509529af5c39983c5a5" Mar 07 04:40:52 crc kubenswrapper[4689]: I0307 04:40:52.858894 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-prqhz" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.210230 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-p2pl5"] Mar 07 04:40:54 crc kubenswrapper[4689]: E0307 04:40:54.210863 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e18750b-34ac-4c21-8b43-7b6dad049da6" containerName="mariadb-database-create" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.210877 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e18750b-34ac-4c21-8b43-7b6dad049da6" containerName="mariadb-database-create" Mar 07 04:40:54 crc kubenswrapper[4689]: E0307 04:40:54.210894 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b14ab1c2-06d1-4279-bfa3-56af17e87ec2" containerName="mariadb-account-create-update" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.210902 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14ab1c2-06d1-4279-bfa3-56af17e87ec2" containerName="mariadb-account-create-update" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.211074 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="b14ab1c2-06d1-4279-bfa3-56af17e87ec2" 
containerName="mariadb-account-create-update" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.211097 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e18750b-34ac-4c21-8b43-7b6dad049da6" containerName="mariadb-database-create" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.211661 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.215027 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.215354 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-xzcfg" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.224416 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-p2pl5"] Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.340905 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-db-sync-config-data\") pod \"glance-db-sync-p2pl5\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.341118 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvn8w\" (UniqueName: \"kubernetes.io/projected/a35dda23-58ef-41e5-828a-46b51a98acb7-kube-api-access-rvn8w\") pod \"glance-db-sync-p2pl5\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.341466 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-config-data\") pod \"glance-db-sync-p2pl5\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.443071 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-config-data\") pod \"glance-db-sync-p2pl5\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.443422 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-db-sync-config-data\") pod \"glance-db-sync-p2pl5\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.443642 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvn8w\" (UniqueName: \"kubernetes.io/projected/a35dda23-58ef-41e5-828a-46b51a98acb7-kube-api-access-rvn8w\") pod \"glance-db-sync-p2pl5\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.449341 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-db-sync-config-data\") pod \"glance-db-sync-p2pl5\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.449671 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-config-data\") pod 
\"glance-db-sync-p2pl5\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.465755 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvn8w\" (UniqueName: \"kubernetes.io/projected/a35dda23-58ef-41e5-828a-46b51a98acb7-kube-api-access-rvn8w\") pod \"glance-db-sync-p2pl5\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.527329 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:40:54 crc kubenswrapper[4689]: W0307 04:40:54.772763 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda35dda23_58ef_41e5_828a_46b51a98acb7.slice/crio-43ed8f3bc7470f6900c20b6c39519b707e87f8012881bbbd65b7e7eba295499e WatchSource:0}: Error finding container 43ed8f3bc7470f6900c20b6c39519b707e87f8012881bbbd65b7e7eba295499e: Status 404 returned error can't find the container with id 43ed8f3bc7470f6900c20b6c39519b707e87f8012881bbbd65b7e7eba295499e Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.773284 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-p2pl5"] Mar 07 04:40:54 crc kubenswrapper[4689]: I0307 04:40:54.873951 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-p2pl5" event={"ID":"a35dda23-58ef-41e5-828a-46b51a98acb7","Type":"ContainerStarted","Data":"43ed8f3bc7470f6900c20b6c39519b707e87f8012881bbbd65b7e7eba295499e"} Mar 07 04:40:55 crc kubenswrapper[4689]: I0307 04:40:55.883063 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-p2pl5" 
event={"ID":"a35dda23-58ef-41e5-828a-46b51a98acb7","Type":"ContainerStarted","Data":"4a7450be8431eb55a40d28f678bab089af116d3d77f3e7540640b150e3601f72"} Mar 07 04:40:55 crc kubenswrapper[4689]: I0307 04:40:55.907139 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-p2pl5" podStartSLOduration=1.907115575 podStartE2EDuration="1.907115575s" podCreationTimestamp="2026-03-07 04:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:40:55.900730242 +0000 UTC m=+1300.947113731" watchObservedRunningTime="2026-03-07 04:40:55.907115575 +0000 UTC m=+1300.953499084" Mar 07 04:40:58 crc kubenswrapper[4689]: I0307 04:40:58.910431 4689 generic.go:334] "Generic (PLEG): container finished" podID="a35dda23-58ef-41e5-828a-46b51a98acb7" containerID="4a7450be8431eb55a40d28f678bab089af116d3d77f3e7540640b150e3601f72" exitCode=0 Mar 07 04:40:58 crc kubenswrapper[4689]: I0307 04:40:58.910543 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-p2pl5" event={"ID":"a35dda23-58ef-41e5-828a-46b51a98acb7","Type":"ContainerDied","Data":"4a7450be8431eb55a40d28f678bab089af116d3d77f3e7540640b150e3601f72"} Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.282595 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.330816 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-db-sync-config-data\") pod \"a35dda23-58ef-41e5-828a-46b51a98acb7\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.330989 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvn8w\" (UniqueName: \"kubernetes.io/projected/a35dda23-58ef-41e5-828a-46b51a98acb7-kube-api-access-rvn8w\") pod \"a35dda23-58ef-41e5-828a-46b51a98acb7\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.331059 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-config-data\") pod \"a35dda23-58ef-41e5-828a-46b51a98acb7\" (UID: \"a35dda23-58ef-41e5-828a-46b51a98acb7\") " Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.337327 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a35dda23-58ef-41e5-828a-46b51a98acb7-kube-api-access-rvn8w" (OuterVolumeSpecName: "kube-api-access-rvn8w") pod "a35dda23-58ef-41e5-828a-46b51a98acb7" (UID: "a35dda23-58ef-41e5-828a-46b51a98acb7"). InnerVolumeSpecName "kube-api-access-rvn8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.341428 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a35dda23-58ef-41e5-828a-46b51a98acb7" (UID: "a35dda23-58ef-41e5-828a-46b51a98acb7"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.382970 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-config-data" (OuterVolumeSpecName: "config-data") pod "a35dda23-58ef-41e5-828a-46b51a98acb7" (UID: "a35dda23-58ef-41e5-828a-46b51a98acb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.433040 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvn8w\" (UniqueName: \"kubernetes.io/projected/a35dda23-58ef-41e5-828a-46b51a98acb7-kube-api-access-rvn8w\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.433084 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.433097 4689 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a35dda23-58ef-41e5-828a-46b51a98acb7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.941772 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-p2pl5" event={"ID":"a35dda23-58ef-41e5-828a-46b51a98acb7","Type":"ContainerDied","Data":"43ed8f3bc7470f6900c20b6c39519b707e87f8012881bbbd65b7e7eba295499e"} Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.941822 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43ed8f3bc7470f6900c20b6c39519b707e87f8012881bbbd65b7e7eba295499e" Mar 07 04:41:00 crc kubenswrapper[4689]: I0307 04:41:00.941837 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-p2pl5" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.171116 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:41:02 crc kubenswrapper[4689]: E0307 04:41:02.172512 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a35dda23-58ef-41e5-828a-46b51a98acb7" containerName="glance-db-sync" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.172535 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35dda23-58ef-41e5-828a-46b51a98acb7" containerName="glance-db-sync" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.172779 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="a35dda23-58ef-41e5-828a-46b51a98acb7" containerName="glance-db-sync" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.174005 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.175819 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.176907 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.178297 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-xzcfg" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.190566 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265195 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-scripts\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265254 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-logs\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265318 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265359 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82tp5\" (UniqueName: \"kubernetes.io/projected/52c06d64-fd58-4794-8a36-e3036d6d728f-kube-api-access-82tp5\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265387 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-sys\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265414 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dev\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-dev\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265430 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265445 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-run\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265467 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265573 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265648 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-config-data\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265717 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265757 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.265796 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.354257 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.355893 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.360651 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.366515 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-scripts\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.366548 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-logs\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.366574 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.366598 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82tp5\" (UniqueName: \"kubernetes.io/projected/52c06d64-fd58-4794-8a36-e3036d6d728f-kube-api-access-82tp5\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.366895 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.367049 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-logs\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.367150 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-sys\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.370565 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-sys\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.376128 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-scripts\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377327 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-dev\") pod 
\"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377377 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377407 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-run\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377448 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377507 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377548 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377613 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377631 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377651 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377745 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377784 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-dev\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377806 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.377827 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-run\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.378140 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.378317 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.378461 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" 
Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.380284 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.382686 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-config-data\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.399149 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.400140 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.403351 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82tp5\" (UniqueName: \"kubernetes.io/projected/52c06d64-fd58-4794-8a36-e3036d6d728f-kube-api-access-82tp5\") pod \"glance-default-external-api-0\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.403438 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: 
\"52c06d64-fd58-4794-8a36-e3036d6d728f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.478852 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.478892 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-dev\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.478913 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.478936 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-sys\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.478952 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6gvn\" (UniqueName: \"kubernetes.io/projected/5ed33553-b5b4-449c-bda1-6c01a8c65e41-kube-api-access-b6gvn\") pod 
\"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.478967 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.478997 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.479031 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-run\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.479046 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.479085 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.479102 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.479130 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.479144 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.479157 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.504120 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.580837 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.580876 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.580907 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.580922 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.580938 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 
07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.580959 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-dev\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.580972 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.580991 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.581011 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-sys\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.581027 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6gvn\" (UniqueName: \"kubernetes.io/projected/5ed33553-b5b4-449c-bda1-6c01a8c65e41-kube-api-access-b6gvn\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 
04:41:02.581047 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.581072 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.581116 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-run\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.581133 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.581233 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.581585 4689 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.582699 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.582932 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.583854 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-dev\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.583960 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.584245 4689 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-run\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.584280 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-sys\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.584281 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.584482 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.584658 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.585980 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.586489 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.606497 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.609896 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6gvn\" (UniqueName: \"kubernetes.io/projected/5ed33553-b5b4-449c-bda1-6c01a8c65e41-kube-api-access-b6gvn\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.634834 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:02 crc kubenswrapper[4689]: I0307 04:41:02.670686 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:03 crc kubenswrapper[4689]: I0307 04:41:03.017940 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:41:03 crc kubenswrapper[4689]: I0307 04:41:03.060565 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:03 crc kubenswrapper[4689]: I0307 04:41:03.124052 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:03 crc kubenswrapper[4689]: I0307 04:41:03.971505 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"5ed33553-b5b4-449c-bda1-6c01a8c65e41","Type":"ContainerStarted","Data":"dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6"} Mar 07 04:41:03 crc kubenswrapper[4689]: I0307 04:41:03.971679 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="5ed33553-b5b4-449c-bda1-6c01a8c65e41" containerName="glance-log" containerID="cri-o://3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98" gracePeriod=30 Mar 07 04:41:03 crc kubenswrapper[4689]: I0307 04:41:03.971736 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="5ed33553-b5b4-449c-bda1-6c01a8c65e41" containerName="glance-httpd" containerID="cri-o://dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6" gracePeriod=30 Mar 07 04:41:03 crc kubenswrapper[4689]: I0307 04:41:03.972411 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"5ed33553-b5b4-449c-bda1-6c01a8c65e41","Type":"ContainerStarted","Data":"3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98"} Mar 07 
04:41:03 crc kubenswrapper[4689]: I0307 04:41:03.972463 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"5ed33553-b5b4-449c-bda1-6c01a8c65e41","Type":"ContainerStarted","Data":"dd59768bf1904c1c7a6724d3ee4eeaccc3b0c7b9dcd64d06b1dbaea66eb546dd"} Mar 07 04:41:03 crc kubenswrapper[4689]: I0307 04:41:03.977892 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"52c06d64-fd58-4794-8a36-e3036d6d728f","Type":"ContainerStarted","Data":"1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38"} Mar 07 04:41:03 crc kubenswrapper[4689]: I0307 04:41:03.977940 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"52c06d64-fd58-4794-8a36-e3036d6d728f","Type":"ContainerStarted","Data":"7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116"} Mar 07 04:41:03 crc kubenswrapper[4689]: I0307 04:41:03.977958 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"52c06d64-fd58-4794-8a36-e3036d6d728f","Type":"ContainerStarted","Data":"0e8fb8f09cadbc4d6110cad6b839d3cf24a01765b5a321c4cb30dc3c7c0baf2d"} Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.016712 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.016689189 podStartE2EDuration="3.016689189s" podCreationTimestamp="2026-03-07 04:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:41:04.010036439 +0000 UTC m=+1309.056419938" watchObservedRunningTime="2026-03-07 04:41:04.016689189 +0000 UTC m=+1309.063072688" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.037507 4689 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.037480818 podStartE2EDuration="2.037480818s" podCreationTimestamp="2026-03-07 04:41:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:41:04.031876627 +0000 UTC m=+1309.078260146" watchObservedRunningTime="2026-03-07 04:41:04.037480818 +0000 UTC m=+1309.083864317" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.382100 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.407037 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-scripts\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.407369 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.407489 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-sys\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.407663 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-nvme\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 
04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.407760 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-run\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.407861 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-var-locks-brick\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.407955 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-iscsi\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.408049 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-logs\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.408158 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-config-data\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.408319 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: 
\"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.408414 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-lib-modules\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.408531 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6gvn\" (UniqueName: \"kubernetes.io/projected/5ed33553-b5b4-449c-bda1-6c01a8c65e41-kube-api-access-b6gvn\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.408613 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-dev\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.408692 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-httpd-run\") pod \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\" (UID: \"5ed33553-b5b4-449c-bda1-6c01a8c65e41\") " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.409531 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.409693 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-logs" (OuterVolumeSpecName: "logs") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.409726 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-sys" (OuterVolumeSpecName: "sys") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.409745 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.409765 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-run" (OuterVolumeSpecName: "run") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.409780 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.409832 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.409852 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.410688 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-dev" (OuterVolumeSpecName: "dev") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.414462 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed33553-b5b4-449c-bda1-6c01a8c65e41-kube-api-access-b6gvn" (OuterVolumeSpecName: "kube-api-access-b6gvn") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "kube-api-access-b6gvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.414972 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-scripts" (OuterVolumeSpecName: "scripts") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.418310 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.418401 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance-cache") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "local-storage16-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.446395 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-config-data" (OuterVolumeSpecName: "config-data") pod "5ed33553-b5b4-449c-bda1-6c01a8c65e41" (UID: "5ed33553-b5b4-449c-bda1-6c01a8c65e41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511146 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511220 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511241 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511258 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511304 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511326 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511354 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6gvn\" (UniqueName: \"kubernetes.io/projected/5ed33553-b5b4-449c-bda1-6c01a8c65e41-kube-api-access-b6gvn\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511377 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511398 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ed33553-b5b4-449c-bda1-6c01a8c65e41-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511417 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ed33553-b5b4-449c-bda1-6c01a8c65e41-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511444 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511464 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511486 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.511508 4689 reconciler_common.go:293] 
"Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ed33553-b5b4-449c-bda1-6c01a8c65e41-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.535017 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.535034 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.612862 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.613096 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.991692 4689 generic.go:334] "Generic (PLEG): container finished" podID="5ed33553-b5b4-449c-bda1-6c01a8c65e41" containerID="dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6" exitCode=143 Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.992367 4689 generic.go:334] "Generic (PLEG): container finished" podID="5ed33553-b5b4-449c-bda1-6c01a8c65e41" containerID="3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98" exitCode=143 Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.991798 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"5ed33553-b5b4-449c-bda1-6c01a8c65e41","Type":"ContainerDied","Data":"dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6"} Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 
04:41:04.992541 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"5ed33553-b5b4-449c-bda1-6c01a8c65e41","Type":"ContainerDied","Data":"3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98"} Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.992588 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"5ed33553-b5b4-449c-bda1-6c01a8c65e41","Type":"ContainerDied","Data":"dd59768bf1904c1c7a6724d3ee4eeaccc3b0c7b9dcd64d06b1dbaea66eb546dd"} Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.992630 4689 scope.go:117] "RemoveContainer" containerID="dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6" Mar 07 04:41:04 crc kubenswrapper[4689]: I0307 04:41:04.991842 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.032691 4689 scope.go:117] "RemoveContainer" containerID="3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.047160 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.074650 4689 scope.go:117] "RemoveContainer" containerID="dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6" Mar 07 04:41:05 crc kubenswrapper[4689]: E0307 04:41:05.086112 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6\": container with ID starting with dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6 not found: ID does not exist" containerID="dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6" Mar 07 04:41:05 crc 
kubenswrapper[4689]: I0307 04:41:05.086268 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6"} err="failed to get container status \"dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6\": rpc error: code = NotFound desc = could not find container \"dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6\": container with ID starting with dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6 not found: ID does not exist" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.086325 4689 scope.go:117] "RemoveContainer" containerID="3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98" Mar 07 04:41:05 crc kubenswrapper[4689]: E0307 04:41:05.089868 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98\": container with ID starting with 3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98 not found: ID does not exist" containerID="3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.089942 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98"} err="failed to get container status \"3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98\": rpc error: code = NotFound desc = could not find container \"3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98\": container with ID starting with 3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98 not found: ID does not exist" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.089976 4689 scope.go:117] "RemoveContainer" containerID="dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6" Mar 07 
04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.090456 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6"} err="failed to get container status \"dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6\": rpc error: code = NotFound desc = could not find container \"dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6\": container with ID starting with dce9183ca4d8ec6bfcd64c75f747afd4813e1ed15c336b392a5380648c5353e6 not found: ID does not exist" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.090517 4689 scope.go:117] "RemoveContainer" containerID="3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.090898 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98"} err="failed to get container status \"3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98\": rpc error: code = NotFound desc = could not find container \"3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98\": container with ID starting with 3e90aae20d5fe79a57de28a27a4cc657c359412f333eb75905003a403aae0e98 not found: ID does not exist" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.092641 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.110693 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:05 crc kubenswrapper[4689]: E0307 04:41:05.111302 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed33553-b5b4-449c-bda1-6c01a8c65e41" containerName="glance-httpd" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.111445 4689 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5ed33553-b5b4-449c-bda1-6c01a8c65e41" containerName="glance-httpd" Mar 07 04:41:05 crc kubenswrapper[4689]: E0307 04:41:05.111562 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed33553-b5b4-449c-bda1-6c01a8c65e41" containerName="glance-log" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.111667 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed33553-b5b4-449c-bda1-6c01a8c65e41" containerName="glance-log" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.111931 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed33553-b5b4-449c-bda1-6c01a8c65e41" containerName="glance-httpd" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.112024 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed33553-b5b4-449c-bda1-6c01a8c65e41" containerName="glance-log" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.113022 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.115331 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.121788 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.220549 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.221294 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.221469 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.221572 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-dev\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.221704 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.221828 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.221925 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.222009 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.222128 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-run\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.222338 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-sys\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.222516 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mkjg\" (UniqueName: \"kubernetes.io/projected/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-kube-api-access-5mkjg\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.222673 
4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.222806 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.222965 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.324837 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325256 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 
04:41:05.325330 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325352 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-dev\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325383 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325412 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325435 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325456 4689 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325483 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-run\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325521 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-sys\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325563 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mkjg\" (UniqueName: \"kubernetes.io/projected/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-kube-api-access-5mkjg\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325599 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325617 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325644 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325750 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.325004 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.326231 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-run\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.326269 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-sys\") pod 
\"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.326301 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.326609 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.327109 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-dev\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.327144 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.327474 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.327581 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.327813 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.347053 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.347565 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.360112 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mkjg\" (UniqueName: \"kubernetes.io/projected/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-kube-api-access-5mkjg\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.367162 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.370374 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.430453 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.836961 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed33553-b5b4-449c-bda1-6c01a8c65e41" path="/var/lib/kubelet/pods/5ed33553-b5b4-449c-bda1-6c01a8c65e41/volumes" Mar 07 04:41:05 crc kubenswrapper[4689]: I0307 04:41:05.907909 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:05 crc kubenswrapper[4689]: W0307 04:41:05.918296 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9ebe9af_b36d_48b6_b5a4_7a8b9031b7c6.slice/crio-2a3b0529581393e12bc5f398fc24ec0b92866625257ee208289ff24832501cf2 WatchSource:0}: Error finding container 2a3b0529581393e12bc5f398fc24ec0b92866625257ee208289ff24832501cf2: Status 404 returned error can't find the container with id 2a3b0529581393e12bc5f398fc24ec0b92866625257ee208289ff24832501cf2 Mar 07 04:41:06 crc kubenswrapper[4689]: I0307 
04:41:06.000887 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6","Type":"ContainerStarted","Data":"2a3b0529581393e12bc5f398fc24ec0b92866625257ee208289ff24832501cf2"} Mar 07 04:41:07 crc kubenswrapper[4689]: I0307 04:41:07.013035 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6","Type":"ContainerStarted","Data":"908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31"} Mar 07 04:41:07 crc kubenswrapper[4689]: I0307 04:41:07.013394 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6","Type":"ContainerStarted","Data":"c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b"} Mar 07 04:41:07 crc kubenswrapper[4689]: I0307 04:41:07.035491 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.035467335 podStartE2EDuration="2.035467335s" podCreationTimestamp="2026-03-07 04:41:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:41:07.031526618 +0000 UTC m=+1312.077910107" watchObservedRunningTime="2026-03-07 04:41:07.035467335 +0000 UTC m=+1312.081850844" Mar 07 04:41:12 crc kubenswrapper[4689]: I0307 04:41:12.505476 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:12 crc kubenswrapper[4689]: I0307 04:41:12.506137 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:12 crc kubenswrapper[4689]: I0307 04:41:12.542396 4689 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:12 crc kubenswrapper[4689]: I0307 04:41:12.570946 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:13 crc kubenswrapper[4689]: I0307 04:41:13.077651 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:13 crc kubenswrapper[4689]: I0307 04:41:13.077706 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:15 crc kubenswrapper[4689]: I0307 04:41:15.060063 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:15 crc kubenswrapper[4689]: I0307 04:41:15.093049 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:15 crc kubenswrapper[4689]: I0307 04:41:15.431285 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:15 crc kubenswrapper[4689]: I0307 04:41:15.432398 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:15 crc kubenswrapper[4689]: I0307 04:41:15.478714 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:15 crc kubenswrapper[4689]: I0307 04:41:15.499379 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:16 crc kubenswrapper[4689]: I0307 04:41:16.107863 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" 
Mar 07 04:41:16 crc kubenswrapper[4689]: I0307 04:41:16.107934 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:17 crc kubenswrapper[4689]: I0307 04:41:17.996323 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:18 crc kubenswrapper[4689]: I0307 04:41:18.001830 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.350252 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.352141 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.368664 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.371081 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.376433 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.383656 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.502806 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-sys\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.502877 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.502906 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.502931 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-lib-modules\") pod \"glance-default-external-api-2\" (UID: 
\"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.502972 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-run\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.502994 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-config-data\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503050 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503071 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503112 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-var-locks-brick\") pod 
\"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503147 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-logs\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503188 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-sys\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503221 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-config-data\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503328 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503370 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503391 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwjz\" (UniqueName: \"kubernetes.io/projected/d4e746e3-9e78-4346-bead-cfe055fccbe6-kube-api-access-gzwjz\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503431 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-scripts\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503453 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503479 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503520 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xzj45\" (UniqueName: \"kubernetes.io/projected/16de902a-2867-4549-b244-3a3752df534a-kube-api-access-xzj45\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503544 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503567 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-scripts\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503609 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-dev\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503635 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503680 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-run\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503705 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-dev\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503725 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-logs\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503783 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.503802 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.576957 4689 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.578061 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.589329 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.613973 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-logs\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614052 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-sys\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614104 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-config-data\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614208 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614249 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614291 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwjz\" (UniqueName: \"kubernetes.io/projected/d4e746e3-9e78-4346-bead-cfe055fccbe6-kube-api-access-gzwjz\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614328 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-scripts\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614369 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614417 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614506 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzj45\" (UniqueName: \"kubernetes.io/projected/16de902a-2867-4549-b244-3a3752df534a-kube-api-access-xzj45\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614551 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614595 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-scripts\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614637 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-dev\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614681 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 
07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614728 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-run\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614768 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-dev\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614802 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-logs\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614869 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614908 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614945 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-sys\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.614989 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.615033 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.615228 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.615267 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-run\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.615308 4689 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-config-data\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.615367 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.615412 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.615453 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.615662 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.615734 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.616099 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.616863 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-dev\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.617341 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-dev\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.617588 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.617641 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod 
\"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.618147 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.618344 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-run\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.618564 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-logs\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.618611 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.618622 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: 
\"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.618644 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-sys\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.618666 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.618686 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.618893 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-logs\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.618933 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-sys\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc 
kubenswrapper[4689]: I0307 04:41:20.618910 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-run\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.622481 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.623288 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.621972 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.623435 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.623707 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.636260 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-config-data\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.637600 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.647513 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-scripts\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.647703 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-scripts\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.648699 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwjz\" (UniqueName: 
\"kubernetes.io/projected/d4e746e3-9e78-4346-bead-cfe055fccbe6-kube-api-access-gzwjz\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.650399 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-config-data\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.651048 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.654138 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.659827 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzj45\" (UniqueName: \"kubernetes.io/projected/16de902a-2867-4549-b244-3a3752df534a-kube-api-access-xzj45\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.679706 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.681569 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-2\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.688565 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.694416 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.717388 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.717666 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.717856 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-sys\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.717936 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.717955 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-scripts\") pod \"glance-default-internal-api-1\" 
(UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.717971 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.717992 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718012 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718048 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-logs\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718094 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-scripts\") pod 
\"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718141 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718209 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-logs\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718226 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718250 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-dev\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718269 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-sys\") pod 
\"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718297 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-run\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718332 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-dev\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718353 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718376 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718398 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8tm5\" (UniqueName: 
\"kubernetes.io/projected/e8a61d36-81f1-41d7-b61e-29969a17ddc1-kube-api-access-b8tm5\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718414 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718436 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jss4p\" (UniqueName: \"kubernetes.io/projected/1190d2df-d031-4c4d-87cf-237581d0cc4c-kube-api-access-jss4p\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718451 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-config-data\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718475 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-run\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718502 4689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718528 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718542 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-config-data\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.718563 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.820577 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-scripts\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.820629 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.820661 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.820688 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.820719 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.820767 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-logs\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.820826 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-scripts\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.820890 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.820932 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.820958 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-logs\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821010 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821039 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-dev\") pod 
\"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821063 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-sys\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821096 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-run\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821130 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-dev\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821153 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821187 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: 
\"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821212 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8tm5\" (UniqueName: \"kubernetes.io/projected/e8a61d36-81f1-41d7-b61e-29969a17ddc1-kube-api-access-b8tm5\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821230 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821255 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jss4p\" (UniqueName: \"kubernetes.io/projected/1190d2df-d031-4c4d-87cf-237581d0cc4c-kube-api-access-jss4p\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821272 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-config-data\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821294 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-run\") pod \"glance-default-internal-api-1\" (UID: 
\"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821356 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821388 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821408 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-config-data\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821436 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821446 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-logs\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821477 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821503 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821514 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821528 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-sys\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821641 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-sys\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc 
kubenswrapper[4689]: I0307 04:41:20.821833 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-logs\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821905 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822186 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822221 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822233 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-dev\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822269 4689 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-run\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822295 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-dev\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822289 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-sys\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822373 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822394 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822660 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822708 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822737 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-run\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822771 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.822814 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.823198 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.821903 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.824840 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.827994 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-scripts\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.828480 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-config-data\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.845439 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-config-data\") pod 
\"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.846975 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-scripts\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.849422 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jss4p\" (UniqueName: \"kubernetes.io/projected/1190d2df-d031-4c4d-87cf-237581d0cc4c-kube-api-access-jss4p\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.855346 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.855706 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.856662 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-1\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.861072 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.861282 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8tm5\" (UniqueName: \"kubernetes.io/projected/e8a61d36-81f1-41d7-b61e-29969a17ddc1-kube-api-access-b8tm5\") pod \"glance-default-internal-api-2\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:20 crc kubenswrapper[4689]: I0307 04:41:20.975887 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:21 crc kubenswrapper[4689]: I0307 04:41:21.052207 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:21 crc kubenswrapper[4689]: I0307 04:41:21.069689 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:21 crc kubenswrapper[4689]: I0307 04:41:21.137453 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:41:21 crc kubenswrapper[4689]: I0307 04:41:21.174722 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"16de902a-2867-4549-b244-3a3752df534a","Type":"ContainerStarted","Data":"316689990ab62ddad95d6cb8d7c610f68ec3d3c8aa948eb94a45307e8a176fcc"} Mar 07 04:41:21 crc kubenswrapper[4689]: I0307 04:41:21.500234 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:41:21 crc kubenswrapper[4689]: I0307 04:41:21.554914 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:41:21 crc kubenswrapper[4689]: I0307 04:41:21.588224 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:41:21 crc kubenswrapper[4689]: W0307 04:41:21.605213 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1190d2df_d031_4c4d_87cf_237581d0cc4c.slice/crio-24c2a0bce7c5d7e37c7b6fbef9c99fb74d6941e3431dd8389a555bc713d42a47 WatchSource:0}: Error finding container 24c2a0bce7c5d7e37c7b6fbef9c99fb74d6941e3431dd8389a555bc713d42a47: Status 404 returned error can't find the container with id 24c2a0bce7c5d7e37c7b6fbef9c99fb74d6941e3431dd8389a555bc713d42a47 Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.198978 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"d4e746e3-9e78-4346-bead-cfe055fccbe6","Type":"ContainerStarted","Data":"f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2"} Mar 07 04:41:22 crc 
kubenswrapper[4689]: I0307 04:41:22.199532 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"d4e746e3-9e78-4346-bead-cfe055fccbe6","Type":"ContainerStarted","Data":"75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd"} Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.199548 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"d4e746e3-9e78-4346-bead-cfe055fccbe6","Type":"ContainerStarted","Data":"46a27ed3cea0106aef2356d6e505c9a213a72c88e42ec950ee3c28ba41afb185"} Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.203783 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e8a61d36-81f1-41d7-b61e-29969a17ddc1","Type":"ContainerStarted","Data":"be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21"} Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.203828 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e8a61d36-81f1-41d7-b61e-29969a17ddc1","Type":"ContainerStarted","Data":"89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483"} Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.203841 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e8a61d36-81f1-41d7-b61e-29969a17ddc1","Type":"ContainerStarted","Data":"186f8005b849e2f4df919c3b86667cc32b19122116f9c75abb96c9fe95175bd6"} Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.207339 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1190d2df-d031-4c4d-87cf-237581d0cc4c","Type":"ContainerStarted","Data":"53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb"} Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.207406 4689 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1190d2df-d031-4c4d-87cf-237581d0cc4c","Type":"ContainerStarted","Data":"5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66"} Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.207432 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1190d2df-d031-4c4d-87cf-237581d0cc4c","Type":"ContainerStarted","Data":"24c2a0bce7c5d7e37c7b6fbef9c99fb74d6941e3431dd8389a555bc713d42a47"} Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.212203 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"16de902a-2867-4549-b244-3a3752df534a","Type":"ContainerStarted","Data":"16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04"} Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.212413 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"16de902a-2867-4549-b244-3a3752df534a","Type":"ContainerStarted","Data":"9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254"} Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.237086 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.237069365 podStartE2EDuration="3.237069365s" podCreationTimestamp="2026-03-07 04:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:41:22.229064859 +0000 UTC m=+1327.275448388" watchObservedRunningTime="2026-03-07 04:41:22.237069365 +0000 UTC m=+1327.283452854" Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.258302 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.258282576 podStartE2EDuration="3.258282576s" podCreationTimestamp="2026-03-07 04:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:41:22.256859407 +0000 UTC m=+1327.303242906" watchObservedRunningTime="2026-03-07 04:41:22.258282576 +0000 UTC m=+1327.304666085" Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.285053 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.285036046 podStartE2EDuration="3.285036046s" podCreationTimestamp="2026-03-07 04:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:41:22.278701776 +0000 UTC m=+1327.325085285" watchObservedRunningTime="2026-03-07 04:41:22.285036046 +0000 UTC m=+1327.331419525" Mar 07 04:41:22 crc kubenswrapper[4689]: I0307 04:41:22.324104 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.324075357 podStartE2EDuration="3.324075357s" podCreationTimestamp="2026-03-07 04:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:41:22.316996877 +0000 UTC m=+1327.363380406" watchObservedRunningTime="2026-03-07 04:41:22.324075357 +0000 UTC m=+1327.370458886" Mar 07 04:41:29 crc kubenswrapper[4689]: I0307 04:41:29.189447 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:41:29 crc kubenswrapper[4689]: 
I0307 04:41:29.190075 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:41:30 crc kubenswrapper[4689]: I0307 04:41:30.695965 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:30 crc kubenswrapper[4689]: I0307 04:41:30.696528 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:30 crc kubenswrapper[4689]: I0307 04:41:30.735601 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:30 crc kubenswrapper[4689]: I0307 04:41:30.740425 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:30 crc kubenswrapper[4689]: I0307 04:41:30.976984 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:30 crc kubenswrapper[4689]: I0307 04:41:30.977052 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.010242 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.048419 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.052418 4689 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.052503 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.070641 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.070705 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.090141 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.120113 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.123350 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.131454 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.299968 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.300032 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.300371 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.300414 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.300444 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.300468 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.300489 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:31 crc kubenswrapper[4689]: I0307 04:41:31.300508 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.237640 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.255031 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.314131 4689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.314190 4689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.314144 4689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.314213 4689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 
04:41:33.327248 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.382850 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.382973 4689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.462473 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.495762 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.540849 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:33 crc kubenswrapper[4689]: I0307 04:41:33.753601 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:35 crc kubenswrapper[4689]: I0307 04:41:35.097062 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:41:35 crc kubenswrapper[4689]: I0307 04:41:35.103698 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:41:35 crc kubenswrapper[4689]: I0307 04:41:35.309356 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:41:35 crc kubenswrapper[4689]: I0307 04:41:35.329953 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:41:35 crc kubenswrapper[4689]: I0307 
04:41:35.334323 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="1190d2df-d031-4c4d-87cf-237581d0cc4c" containerName="glance-log" containerID="cri-o://5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66" gracePeriod=30 Mar 07 04:41:35 crc kubenswrapper[4689]: I0307 04:41:35.334701 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="16de902a-2867-4549-b244-3a3752df534a" containerName="glance-log" containerID="cri-o://9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254" gracePeriod=30 Mar 07 04:41:35 crc kubenswrapper[4689]: I0307 04:41:35.334836 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="16de902a-2867-4549-b244-3a3752df534a" containerName="glance-httpd" containerID="cri-o://16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04" gracePeriod=30 Mar 07 04:41:35 crc kubenswrapper[4689]: I0307 04:41:35.334850 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="1190d2df-d031-4c4d-87cf-237581d0cc4c" containerName="glance-httpd" containerID="cri-o://53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb" gracePeriod=30 Mar 07 04:41:35 crc kubenswrapper[4689]: I0307 04:41:35.343985 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="16de902a-2867-4549-b244-3a3752df534a" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.136:9292/healthcheck\": EOF" Mar 07 04:41:35 crc kubenswrapper[4689]: I0307 04:41:35.344107 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="16de902a-2867-4549-b244-3a3752df534a" 
containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.136:9292/healthcheck\": EOF" Mar 07 04:41:36 crc kubenswrapper[4689]: I0307 04:41:36.350608 4689 generic.go:334] "Generic (PLEG): container finished" podID="16de902a-2867-4549-b244-3a3752df534a" containerID="9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254" exitCode=143 Mar 07 04:41:36 crc kubenswrapper[4689]: I0307 04:41:36.350675 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"16de902a-2867-4549-b244-3a3752df534a","Type":"ContainerDied","Data":"9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254"} Mar 07 04:41:36 crc kubenswrapper[4689]: I0307 04:41:36.353086 4689 generic.go:334] "Generic (PLEG): container finished" podID="1190d2df-d031-4c4d-87cf-237581d0cc4c" containerID="5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66" exitCode=143 Mar 07 04:41:36 crc kubenswrapper[4689]: I0307 04:41:36.353284 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1190d2df-d031-4c4d-87cf-237581d0cc4c","Type":"ContainerDied","Data":"5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66"} Mar 07 04:41:36 crc kubenswrapper[4689]: I0307 04:41:36.353426 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="d4e746e3-9e78-4346-bead-cfe055fccbe6" containerName="glance-httpd" containerID="cri-o://f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2" gracePeriod=30 Mar 07 04:41:36 crc kubenswrapper[4689]: I0307 04:41:36.353339 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="d4e746e3-9e78-4346-bead-cfe055fccbe6" containerName="glance-log" containerID="cri-o://75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd" 
gracePeriod=30 Mar 07 04:41:36 crc kubenswrapper[4689]: I0307 04:41:36.353804 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="e8a61d36-81f1-41d7-b61e-29969a17ddc1" containerName="glance-log" containerID="cri-o://89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483" gracePeriod=30 Mar 07 04:41:36 crc kubenswrapper[4689]: I0307 04:41:36.353829 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="e8a61d36-81f1-41d7-b61e-29969a17ddc1" containerName="glance-httpd" containerID="cri-o://be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21" gracePeriod=30 Mar 07 04:41:37 crc kubenswrapper[4689]: I0307 04:41:37.367990 4689 generic.go:334] "Generic (PLEG): container finished" podID="d4e746e3-9e78-4346-bead-cfe055fccbe6" containerID="75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd" exitCode=143 Mar 07 04:41:37 crc kubenswrapper[4689]: I0307 04:41:37.368098 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"d4e746e3-9e78-4346-bead-cfe055fccbe6","Type":"ContainerDied","Data":"75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd"} Mar 07 04:41:37 crc kubenswrapper[4689]: I0307 04:41:37.372258 4689 generic.go:334] "Generic (PLEG): container finished" podID="e8a61d36-81f1-41d7-b61e-29969a17ddc1" containerID="89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483" exitCode=143 Mar 07 04:41:37 crc kubenswrapper[4689]: I0307 04:41:37.372338 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e8a61d36-81f1-41d7-b61e-29969a17ddc1","Type":"ContainerDied","Data":"89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483"} Mar 07 04:41:38 crc kubenswrapper[4689]: I0307 04:41:38.936230 4689 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.061469 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065329 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-sys\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065387 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-sys" (OuterVolumeSpecName: "sys") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065392 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-dev\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065423 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-dev" (OuterVolumeSpecName: "dev") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065459 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-httpd-run\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065495 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jss4p\" (UniqueName: \"kubernetes.io/projected/1190d2df-d031-4c4d-87cf-237581d0cc4c-kube-api-access-jss4p\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065510 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-lib-modules\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065534 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-nvme\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065551 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-logs\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065575 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065602 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-run\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065647 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065680 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-var-locks-brick\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065706 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-iscsi\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065725 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-scripts\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 
04:41:39.065741 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065761 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-config-data\") pod \"1190d2df-d031-4c4d-87cf-237581d0cc4c\" (UID: \"1190d2df-d031-4c4d-87cf-237581d0cc4c\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.066032 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.066044 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.066052 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065705 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.065847 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.066049 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-logs" (OuterVolumeSpecName: "logs") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.066076 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.066088 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-run" (OuterVolumeSpecName: "run") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.066101 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.070553 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-scripts" (OuterVolumeSpecName: "scripts") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.070554 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.070843 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1190d2df-d031-4c4d-87cf-237581d0cc4c-kube-api-access-jss4p" (OuterVolumeSpecName: "kube-api-access-jss4p") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "kube-api-access-jss4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.072871 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.109991 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-config-data" (OuterVolumeSpecName: "config-data") pod "1190d2df-d031-4c4d-87cf-237581d0cc4c" (UID: "1190d2df-d031-4c4d-87cf-237581d0cc4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167066 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-logs\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167314 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-run\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167398 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-httpd-run\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167492 4689 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xzj45\" (UniqueName: \"kubernetes.io/projected/16de902a-2867-4549-b244-3a3752df534a-kube-api-access-xzj45\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167598 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-logs" (OuterVolumeSpecName: "logs") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167614 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-config-data\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167678 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167703 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-sys\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167719 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-lib-modules\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: 
\"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167732 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-dev\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167778 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167797 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-iscsi\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167816 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-run" (OuterVolumeSpecName: "run") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167851 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-scripts\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167879 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-nvme\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.167926 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-var-locks-brick\") pod \"16de902a-2867-4549-b244-3a3752df534a\" (UID: \"16de902a-2867-4549-b244-3a3752df534a\") " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168268 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-dev" (OuterVolumeSpecName: "dev") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168543 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jss4p\" (UniqueName: \"kubernetes.io/projected/1190d2df-d031-4c4d-87cf-237581d0cc4c-kube-api-access-jss4p\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168561 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168570 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168587 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168597 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168605 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168613 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168643 4689 reconciler_common.go:293] "Volume detached for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168651 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1190d2df-d031-4c4d-87cf-237581d0cc4c-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168660 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168672 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168681 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1190d2df-d031-4c4d-87cf-237581d0cc4c-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168690 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168699 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1190d2df-d031-4c4d-87cf-237581d0cc4c-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168693 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168753 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-sys" (OuterVolumeSpecName: "sys") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.168792 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.169306 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.169377 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.169443 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.171912 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16de902a-2867-4549-b244-3a3752df534a-kube-api-access-xzj45" (OuterVolumeSpecName: "kube-api-access-xzj45") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "kube-api-access-xzj45". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.171934 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.172761 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-scripts" (OuterVolumeSpecName: "scripts") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.174936 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.192912 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.195525 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.221937 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-config-data" (OuterVolumeSpecName: "config-data") pod "16de902a-2867-4549-b244-3a3752df534a" (UID: "16de902a-2867-4549-b244-3a3752df534a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270139 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270198 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzj45\" (UniqueName: \"kubernetes.io/projected/16de902a-2867-4549-b244-3a3752df534a-kube-api-access-xzj45\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270214 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270256 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270276 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270285 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270299 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270309 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" 
(UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270319 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16de902a-2867-4549-b244-3a3752df534a-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270330 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270338 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270349 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/16de902a-2867-4549-b244-3a3752df534a-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.270370 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16de902a-2867-4549-b244-3a3752df534a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.283059 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.287636 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.371410 4689 reconciler_common.go:293] "Volume detached for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.371449 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.394849 4689 generic.go:334] "Generic (PLEG): container finished" podID="1190d2df-d031-4c4d-87cf-237581d0cc4c" containerID="53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb" exitCode=0 Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.394916 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1190d2df-d031-4c4d-87cf-237581d0cc4c","Type":"ContainerDied","Data":"53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb"} Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.394979 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"1190d2df-d031-4c4d-87cf-237581d0cc4c","Type":"ContainerDied","Data":"24c2a0bce7c5d7e37c7b6fbef9c99fb74d6941e3431dd8389a555bc713d42a47"} Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.394979 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.395000 4689 scope.go:117] "RemoveContainer" containerID="53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.403707 4689 generic.go:334] "Generic (PLEG): container finished" podID="16de902a-2867-4549-b244-3a3752df534a" containerID="16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04" exitCode=0 Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.403757 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"16de902a-2867-4549-b244-3a3752df534a","Type":"ContainerDied","Data":"16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04"} Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.403789 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.403815 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"16de902a-2867-4549-b244-3a3752df534a","Type":"ContainerDied","Data":"316689990ab62ddad95d6cb8d7c610f68ec3d3c8aa948eb94a45307e8a176fcc"} Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.421276 4689 scope.go:117] "RemoveContainer" containerID="5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.440248 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.445464 4689 scope.go:117] "RemoveContainer" containerID="53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb" Mar 07 04:41:39 crc kubenswrapper[4689]: E0307 04:41:39.446075 4689 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb\": container with ID starting with 53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb not found: ID does not exist" containerID="53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.446143 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb"} err="failed to get container status \"53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb\": rpc error: code = NotFound desc = could not find container \"53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb\": container with ID starting with 53fbe763f89ef738696352c514f2b4ee47b3318f90152313b6d508969386f0fb not found: ID does not exist" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.446186 4689 scope.go:117] "RemoveContainer" containerID="5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66" Mar 07 04:41:39 crc kubenswrapper[4689]: E0307 04:41:39.448624 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66\": container with ID starting with 5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66 not found: ID does not exist" containerID="5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.448686 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66"} err="failed to get container status \"5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66\": rpc error: code = NotFound desc = could not find container 
\"5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66\": container with ID starting with 5468b73cdb1e265ee21159e29da158544543ef308cf97934a184c3dccd0e7e66 not found: ID does not exist" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.448706 4689 scope.go:117] "RemoveContainer" containerID="16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.456516 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.470869 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.482688 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.489127 4689 scope.go:117] "RemoveContainer" containerID="9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.508736 4689 scope.go:117] "RemoveContainer" containerID="16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04" Mar 07 04:41:39 crc kubenswrapper[4689]: E0307 04:41:39.509357 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04\": container with ID starting with 16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04 not found: ID does not exist" containerID="16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.509394 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04"} err="failed to get container status 
\"16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04\": rpc error: code = NotFound desc = could not find container \"16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04\": container with ID starting with 16edf5dcc7d9288a3364f990fd6df1d0353c2cc62efe5358a61b0f9c67988d04 not found: ID does not exist" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.509422 4689 scope.go:117] "RemoveContainer" containerID="9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254" Mar 07 04:41:39 crc kubenswrapper[4689]: E0307 04:41:39.509811 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254\": container with ID starting with 9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254 not found: ID does not exist" containerID="9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.509853 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254"} err="failed to get container status \"9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254\": rpc error: code = NotFound desc = could not find container \"9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254\": container with ID starting with 9ec4579a9c454761286f961617174bad51ba4ab73cc4a93027042af10affa254 not found: ID does not exist" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.838915 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1190d2df-d031-4c4d-87cf-237581d0cc4c" path="/var/lib/kubelet/pods/1190d2df-d031-4c4d-87cf-237581d0cc4c/volumes" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.839810 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16de902a-2867-4549-b244-3a3752df534a" 
path="/var/lib/kubelet/pods/16de902a-2867-4549-b244-3a3752df534a/volumes" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.885692 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:39 crc kubenswrapper[4689]: I0307 04:41:39.955281 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088097 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-iscsi\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088268 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088290 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-nvme\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088382 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088473 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088542 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-var-locks-brick\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088621 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-iscsi\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088724 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-run\") pod 
\"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088825 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-dev\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088866 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-sys\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089020 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-logs\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089114 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-httpd-run\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089234 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-sys\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088419 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-nvme" (OuterVolumeSpecName: 
"etc-nvme") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088713 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088758 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088865 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-dev" (OuterVolumeSpecName: "dev") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089314 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-dev\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089421 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8tm5\" (UniqueName: \"kubernetes.io/projected/e8a61d36-81f1-41d7-b61e-29969a17ddc1-kube-api-access-b8tm5\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088870 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-run" (OuterVolumeSpecName: "run") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089455 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.088974 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-sys" (OuterVolumeSpecName: "sys") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089349 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-dev" (OuterVolumeSpecName: "dev") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089390 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-sys" (OuterVolumeSpecName: "sys") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089504 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-var-locks-brick\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089563 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-logs" (OuterVolumeSpecName: "logs") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089580 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). 
InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089586 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-scripts\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089631 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-logs\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089711 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-config-data\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089749 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-httpd-run\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089797 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzwjz\" (UniqueName: \"kubernetes.io/projected/d4e746e3-9e78-4346-bead-cfe055fccbe6-kube-api-access-gzwjz\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089878 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-config-data\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089914 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-nvme\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.089963 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-run\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090004 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-lib-modules\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090033 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090075 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-logs" (OuterVolumeSpecName: "logs") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090077 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090123 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-scripts\") pod \"d4e746e3-9e78-4346-bead-cfe055fccbe6\" (UID: \"d4e746e3-9e78-4346-bead-cfe055fccbe6\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090145 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090206 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-lib-modules\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090251 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\" (UID: \"e8a61d36-81f1-41d7-b61e-29969a17ddc1\") " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090216 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-run" (OuterVolumeSpecName: "run") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090307 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090384 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090860 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090892 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090910 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090928 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090946 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090963 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.090982 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.091001 4689 reconciler_common.go:293] "Volume detached for volume 
\"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.091017 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.091033 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.091047 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.091064 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.091080 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8a61d36-81f1-41d7-b61e-29969a17ddc1-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.091096 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8a61d36-81f1-41d7-b61e-29969a17ddc1-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.091112 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.091129 4689 reconciler_common.go:293] "Volume detached for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4e746e3-9e78-4346-bead-cfe055fccbe6-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.091145 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.091161 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4e746e3-9e78-4346-bead-cfe055fccbe6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.097329 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.097331 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-scripts" (OuterVolumeSpecName: "scripts") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.097354 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.097331 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.099616 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a61d36-81f1-41d7-b61e-29969a17ddc1-kube-api-access-b8tm5" (OuterVolumeSpecName: "kube-api-access-b8tm5") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "kube-api-access-b8tm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.100259 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e746e3-9e78-4346-bead-cfe055fccbe6-kube-api-access-gzwjz" (OuterVolumeSpecName: "kube-api-access-gzwjz") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "kube-api-access-gzwjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.101689 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-scripts" (OuterVolumeSpecName: "scripts") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.105162 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.149007 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-config-data" (OuterVolumeSpecName: "config-data") pod "d4e746e3-9e78-4346-bead-cfe055fccbe6" (UID: "d4e746e3-9e78-4346-bead-cfe055fccbe6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.168939 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-config-data" (OuterVolumeSpecName: "config-data") pod "e8a61d36-81f1-41d7-b61e-29969a17ddc1" (UID: "e8a61d36-81f1-41d7-b61e-29969a17ddc1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.192662 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8tm5\" (UniqueName: \"kubernetes.io/projected/e8a61d36-81f1-41d7-b61e-29969a17ddc1-kube-api-access-b8tm5\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.192706 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.192725 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a61d36-81f1-41d7-b61e-29969a17ddc1-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.192741 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzwjz\" (UniqueName: \"kubernetes.io/projected/d4e746e3-9e78-4346-bead-cfe055fccbe6-kube-api-access-gzwjz\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.192757 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.192810 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.192827 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e746e3-9e78-4346-bead-cfe055fccbe6-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.192852 4689 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.192874 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.192897 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.215509 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.219377 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.219614 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.220396 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.294549 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.294602 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.294614 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.294625 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.416566 4689 generic.go:334] "Generic (PLEG): container finished" podID="d4e746e3-9e78-4346-bead-cfe055fccbe6" containerID="f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2" exitCode=0 Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.416669 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.416686 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"d4e746e3-9e78-4346-bead-cfe055fccbe6","Type":"ContainerDied","Data":"f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2"} Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.416714 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"d4e746e3-9e78-4346-bead-cfe055fccbe6","Type":"ContainerDied","Data":"46a27ed3cea0106aef2356d6e505c9a213a72c88e42ec950ee3c28ba41afb185"} Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.416733 4689 scope.go:117] "RemoveContainer" containerID="f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.419881 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="e8a61d36-81f1-41d7-b61e-29969a17ddc1" containerID="be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21" exitCode=0 Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.419970 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e8a61d36-81f1-41d7-b61e-29969a17ddc1","Type":"ContainerDied","Data":"be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21"} Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.419991 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.420010 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"e8a61d36-81f1-41d7-b61e-29969a17ddc1","Type":"ContainerDied","Data":"186f8005b849e2f4df919c3b86667cc32b19122116f9c75abb96c9fe95175bd6"} Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.471611 4689 scope.go:117] "RemoveContainer" containerID="75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.489023 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.497606 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.511968 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.529740 4689 scope.go:117] "RemoveContainer" containerID="f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2" Mar 07 04:41:40 crc kubenswrapper[4689]: E0307 04:41:40.534651 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2\": container with ID starting with f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2 not found: ID does not exist" containerID="f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.534728 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2"} err="failed to get container status \"f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2\": rpc error: code = NotFound desc = could not find container \"f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2\": container with ID starting with f82965625f8eb61d259d6e69ed17ce67026feb254fce02dc3530915e9da016d2 not found: ID does not exist" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.534773 4689 scope.go:117] "RemoveContainer" containerID="75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.535358 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Mar 07 04:41:40 crc kubenswrapper[4689]: E0307 04:41:40.535605 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd\": container with ID starting with 75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd not found: ID does not exist" containerID="75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.535651 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd"} err="failed to get container 
status \"75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd\": rpc error: code = NotFound desc = could not find container \"75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd\": container with ID starting with 75afbdb6e40097d859a1ff49723be0b61f87cc6d433f898c8a50c94d50f0d7dd not found: ID does not exist" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.535679 4689 scope.go:117] "RemoveContainer" containerID="be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.556728 4689 scope.go:117] "RemoveContainer" containerID="89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.580420 4689 scope.go:117] "RemoveContainer" containerID="be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21" Mar 07 04:41:40 crc kubenswrapper[4689]: E0307 04:41:40.581473 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21\": container with ID starting with be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21 not found: ID does not exist" containerID="be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.581527 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21"} err="failed to get container status \"be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21\": rpc error: code = NotFound desc = could not find container \"be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21\": container with ID starting with be3e284cdc2bca499c54af5067b7fff0268a6c7461f071593b66064631b5ee21 not found: ID does not exist" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.581580 4689 
scope.go:117] "RemoveContainer" containerID="89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483" Mar 07 04:41:40 crc kubenswrapper[4689]: E0307 04:41:40.581918 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483\": container with ID starting with 89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483 not found: ID does not exist" containerID="89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483" Mar 07 04:41:40 crc kubenswrapper[4689]: I0307 04:41:40.581955 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483"} err="failed to get container status \"89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483\": rpc error: code = NotFound desc = could not find container \"89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483\": container with ID starting with 89a731d28dbf806ee427fcea0900decdba38dd868c85d9b1d6bd90598f016483 not found: ID does not exist" Mar 07 04:41:41 crc kubenswrapper[4689]: I0307 04:41:41.840930 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e746e3-9e78-4346-bead-cfe055fccbe6" path="/var/lib/kubelet/pods/d4e746e3-9e78-4346-bead-cfe055fccbe6/volumes" Mar 07 04:41:41 crc kubenswrapper[4689]: I0307 04:41:41.842336 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a61d36-81f1-41d7-b61e-29969a17ddc1" path="/var/lib/kubelet/pods/e8a61d36-81f1-41d7-b61e-29969a17ddc1/volumes" Mar 07 04:41:41 crc kubenswrapper[4689]: I0307 04:41:41.972075 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:41:41 crc kubenswrapper[4689]: I0307 04:41:41.972539 4689 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-external-api-0" podUID="52c06d64-fd58-4794-8a36-e3036d6d728f" containerName="glance-log" containerID="cri-o://7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116" gracePeriod=30 Mar 07 04:41:41 crc kubenswrapper[4689]: I0307 04:41:41.972614 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="52c06d64-fd58-4794-8a36-e3036d6d728f" containerName="glance-httpd" containerID="cri-o://1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38" gracePeriod=30 Mar 07 04:41:42 crc kubenswrapper[4689]: I0307 04:41:42.449838 4689 generic.go:334] "Generic (PLEG): container finished" podID="52c06d64-fd58-4794-8a36-e3036d6d728f" containerID="7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116" exitCode=143 Mar 07 04:41:42 crc kubenswrapper[4689]: I0307 04:41:42.449886 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"52c06d64-fd58-4794-8a36-e3036d6d728f","Type":"ContainerDied","Data":"7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116"} Mar 07 04:41:42 crc kubenswrapper[4689]: I0307 04:41:42.494569 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:42 crc kubenswrapper[4689]: I0307 04:41:42.494923 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" containerName="glance-log" containerID="cri-o://c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b" gracePeriod=30 Mar 07 04:41:42 crc kubenswrapper[4689]: I0307 04:41:42.495078 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" containerName="glance-httpd" 
containerID="cri-o://908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31" gracePeriod=30 Mar 07 04:41:43 crc kubenswrapper[4689]: I0307 04:41:43.463717 4689 generic.go:334] "Generic (PLEG): container finished" podID="f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" containerID="c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b" exitCode=143 Mar 07 04:41:43 crc kubenswrapper[4689]: I0307 04:41:43.463832 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6","Type":"ContainerDied","Data":"c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b"} Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.452976 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.470869 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-config-data\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.470945 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471021 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-logs\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471092 4689 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-var-locks-brick\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471136 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-dev\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471224 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-lib-modules\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471298 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82tp5\" (UniqueName: \"kubernetes.io/projected/52c06d64-fd58-4794-8a36-e3036d6d728f-kube-api-access-82tp5\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471358 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-iscsi\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471420 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-run\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 
04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471481 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-httpd-run\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471538 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-scripts\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471570 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-sys\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471600 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471636 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-nvme\") pod \"52c06d64-fd58-4794-8a36-e3036d6d728f\" (UID: \"52c06d64-fd58-4794-8a36-e3036d6d728f\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471771 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-run" (OuterVolumeSpecName: "run") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). 
InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471808 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.471879 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-dev" (OuterVolumeSpecName: "dev") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.472017 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.472034 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.472048 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.472075 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: 
"52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.472104 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-sys" (OuterVolumeSpecName: "sys") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.472330 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.473321 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.474648 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.489398 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-scripts" (OuterVolumeSpecName: "scripts") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.489670 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c06d64-fd58-4794-8a36-e3036d6d728f-kube-api-access-82tp5" (OuterVolumeSpecName: "kube-api-access-82tp5") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "kube-api-access-82tp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.489894 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-logs" (OuterVolumeSpecName: "logs") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.492901 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.514135 4689 generic.go:334] "Generic (PLEG): container finished" podID="52c06d64-fd58-4794-8a36-e3036d6d728f" containerID="1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38" exitCode=0 Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.514213 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"52c06d64-fd58-4794-8a36-e3036d6d728f","Type":"ContainerDied","Data":"1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38"} Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.514243 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"52c06d64-fd58-4794-8a36-e3036d6d728f","Type":"ContainerDied","Data":"0e8fb8f09cadbc4d6110cad6b839d3cf24a01765b5a321c4cb30dc3c7c0baf2d"} Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.514266 4689 scope.go:117] "RemoveContainer" containerID="1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.514461 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.524372 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance-cache") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.569274 4689 scope.go:117] "RemoveContainer" containerID="7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.572707 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.572735 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.572748 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.572771 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.572783 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.572800 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.572811 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c06d64-fd58-4794-8a36-e3036d6d728f-logs\") on node \"crc\" DevicePath 
\"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.572823 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.572836 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82tp5\" (UniqueName: \"kubernetes.io/projected/52c06d64-fd58-4794-8a36-e3036d6d728f-kube-api-access-82tp5\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.572848 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/52c06d64-fd58-4794-8a36-e3036d6d728f-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.579877 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-config-data" (OuterVolumeSpecName: "config-data") pod "52c06d64-fd58-4794-8a36-e3036d6d728f" (UID: "52c06d64-fd58-4794-8a36-e3036d6d728f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.583425 4689 scope.go:117] "RemoveContainer" containerID="1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38" Mar 07 04:41:45 crc kubenswrapper[4689]: E0307 04:41:45.583832 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38\": container with ID starting with 1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38 not found: ID does not exist" containerID="1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.583868 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38"} err="failed to get container status \"1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38\": rpc error: code = NotFound desc = could not find container \"1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38\": container with ID starting with 1a5a29a4c16368ad900678f93ab68d2c409fe0ce92e0e1e5a0bc133ac28cff38 not found: ID does not exist" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.583896 4689 scope.go:117] "RemoveContainer" containerID="7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116" Mar 07 04:41:45 crc kubenswrapper[4689]: E0307 04:41:45.584514 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116\": container with ID starting with 7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116 not found: ID does not exist" containerID="7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.584536 
4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116"} err="failed to get container status \"7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116\": rpc error: code = NotFound desc = could not find container \"7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116\": container with ID starting with 7e8babe5b1cd8be833ab6360e71decb6cdbecedad4c83c036cbfbb1a0be1a116 not found: ID does not exist" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.590989 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.592422 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.674494 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.674532 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c06d64-fd58-4794-8a36-e3036d6d728f-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.674546 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.848392 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.854754 4689 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.937225 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.981758 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-dev\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.981812 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-dev" (OuterVolumeSpecName: "dev") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.981827 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-nvme\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.981895 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.981975 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.981994 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-sys\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982060 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-sys" (OuterVolumeSpecName: "sys") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982447 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-scripts\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982483 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-httpd-run\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982519 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mkjg\" (UniqueName: \"kubernetes.io/projected/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-kube-api-access-5mkjg\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982535 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-iscsi\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982554 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-logs\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982573 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-var-locks-brick\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982595 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-config-data\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982624 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982642 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-lib-modules\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982665 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-run\") pod \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\" (UID: \"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6\") " Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982892 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982903 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-dev\") on node \"crc\" 
DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982911 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.982932 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-run" (OuterVolumeSpecName: "run") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.983144 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.983349 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.983388 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.983900 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-logs" (OuterVolumeSpecName: "logs") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.984254 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.986437 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.986807 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-scripts" (OuterVolumeSpecName: "scripts") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.986931 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-kube-api-access-5mkjg" (OuterVolumeSpecName: "kube-api-access-5mkjg") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "kube-api-access-5mkjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:41:45 crc kubenswrapper[4689]: I0307 04:41:45.989234 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance-cache") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.022055 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-config-data" (OuterVolumeSpecName: "config-data") pod "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" (UID: "f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.083643 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.083705 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.083720 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.083731 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.083745 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mkjg\" (UniqueName: \"kubernetes.io/projected/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-kube-api-access-5mkjg\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.083758 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.083769 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.083780 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" 
(UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.083792 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.083809 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.083820 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.097619 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.102279 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.184509 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.184541 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.526009 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" containerID="908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31" exitCode=0 Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.526070 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6","Type":"ContainerDied","Data":"908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31"} Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.526146 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6","Type":"ContainerDied","Data":"2a3b0529581393e12bc5f398fc24ec0b92866625257ee208289ff24832501cf2"} Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.526094 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.526223 4689 scope.go:117] "RemoveContainer" containerID="908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.574382 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.576808 4689 scope.go:117] "RemoveContainer" containerID="c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.583305 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.595364 4689 scope.go:117] "RemoveContainer" containerID="908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31" Mar 07 04:41:46 crc kubenswrapper[4689]: E0307 04:41:46.595587 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31\": container with ID starting with 908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31 not found: ID does not exist" containerID="908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.595620 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31"} err="failed to get container status \"908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31\": rpc error: code = NotFound desc = could not find container \"908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31\": container with ID starting with 908788ae7a45e827b8da512ae5a09e5db6f6dca396b7cc8a911b5a83aeb34f31 not found: ID does not exist" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.595643 4689 scope.go:117] "RemoveContainer" containerID="c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b" Mar 07 04:41:46 crc kubenswrapper[4689]: E0307 04:41:46.595833 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b\": container with ID starting with c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b not found: ID does not exist" containerID="c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b" Mar 07 04:41:46 crc kubenswrapper[4689]: I0307 04:41:46.595857 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b"} err="failed to get container status \"c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b\": rpc error: code = NotFound desc = could not find container 
\"c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b\": container with ID starting with c04cdd8f7253675eb5c69cbd46dc1332ceb44e60706697228de54bc182f79e6b not found: ID does not exist" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.086331 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-p2pl5"] Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.091259 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-p2pl5"] Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142232 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance99c7-account-delete-l2v7k"] Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142543 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1190d2df-d031-4c4d-87cf-237581d0cc4c" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142567 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1190d2df-d031-4c4d-87cf-237581d0cc4c" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142581 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c06d64-fd58-4794-8a36-e3036d6d728f" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142590 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c06d64-fd58-4794-8a36-e3036d6d728f" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142611 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c06d64-fd58-4794-8a36-e3036d6d728f" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142619 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c06d64-fd58-4794-8a36-e3036d6d728f" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142631 4689 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142638 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142650 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16de902a-2867-4549-b244-3a3752df534a" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142658 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="16de902a-2867-4549-b244-3a3752df534a" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142668 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a61d36-81f1-41d7-b61e-29969a17ddc1" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142675 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a61d36-81f1-41d7-b61e-29969a17ddc1" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142688 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e746e3-9e78-4346-bead-cfe055fccbe6" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142695 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e746e3-9e78-4346-bead-cfe055fccbe6" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142707 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16de902a-2867-4549-b244-3a3752df534a" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142715 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="16de902a-2867-4549-b244-3a3752df534a" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142728 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e746e3-9e78-4346-bead-cfe055fccbe6" 
containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142734 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e746e3-9e78-4346-bead-cfe055fccbe6" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142745 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a61d36-81f1-41d7-b61e-29969a17ddc1" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142752 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a61d36-81f1-41d7-b61e-29969a17ddc1" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142770 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1190d2df-d031-4c4d-87cf-237581d0cc4c" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142778 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1190d2df-d031-4c4d-87cf-237581d0cc4c" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: E0307 04:41:47.142790 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142800 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142941 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142960 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="16de902a-2867-4549-b244-3a3752df534a" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142971 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a61d36-81f1-41d7-b61e-29969a17ddc1" containerName="glance-httpd" Mar 07 04:41:47 crc 
kubenswrapper[4689]: I0307 04:41:47.142979 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e746e3-9e78-4346-bead-cfe055fccbe6" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.142991 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="16de902a-2867-4549-b244-3a3752df534a" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.143001 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.143011 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e746e3-9e78-4346-bead-cfe055fccbe6" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.143024 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="1190d2df-d031-4c4d-87cf-237581d0cc4c" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.143036 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c06d64-fd58-4794-8a36-e3036d6d728f" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.143045 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a61d36-81f1-41d7-b61e-29969a17ddc1" containerName="glance-log" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.143056 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c06d64-fd58-4794-8a36-e3036d6d728f" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.143063 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="1190d2df-d031-4c4d-87cf-237581d0cc4c" containerName="glance-httpd" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.143621 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.159381 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance99c7-account-delete-l2v7k"] Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.198761 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8tzh\" (UniqueName: \"kubernetes.io/projected/ea952c57-59fa-4e5c-a28c-16f6604f30b4-kube-api-access-k8tzh\") pod \"glance99c7-account-delete-l2v7k\" (UID: \"ea952c57-59fa-4e5c-a28c-16f6604f30b4\") " pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.198954 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea952c57-59fa-4e5c-a28c-16f6604f30b4-operator-scripts\") pod \"glance99c7-account-delete-l2v7k\" (UID: \"ea952c57-59fa-4e5c-a28c-16f6604f30b4\") " pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.301065 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea952c57-59fa-4e5c-a28c-16f6604f30b4-operator-scripts\") pod \"glance99c7-account-delete-l2v7k\" (UID: \"ea952c57-59fa-4e5c-a28c-16f6604f30b4\") " pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.301964 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8tzh\" (UniqueName: \"kubernetes.io/projected/ea952c57-59fa-4e5c-a28c-16f6604f30b4-kube-api-access-k8tzh\") pod \"glance99c7-account-delete-l2v7k\" (UID: \"ea952c57-59fa-4e5c-a28c-16f6604f30b4\") " pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 
04:41:47.301875 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea952c57-59fa-4e5c-a28c-16f6604f30b4-operator-scripts\") pod \"glance99c7-account-delete-l2v7k\" (UID: \"ea952c57-59fa-4e5c-a28c-16f6604f30b4\") " pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.327984 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8tzh\" (UniqueName: \"kubernetes.io/projected/ea952c57-59fa-4e5c-a28c-16f6604f30b4-kube-api-access-k8tzh\") pod \"glance99c7-account-delete-l2v7k\" (UID: \"ea952c57-59fa-4e5c-a28c-16f6604f30b4\") " pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.462200 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.843283 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c06d64-fd58-4794-8a36-e3036d6d728f" path="/var/lib/kubelet/pods/52c06d64-fd58-4794-8a36-e3036d6d728f/volumes" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.844667 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a35dda23-58ef-41e5-828a-46b51a98acb7" path="/var/lib/kubelet/pods/a35dda23-58ef-41e5-828a-46b51a98acb7/volumes" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.846319 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6" path="/var/lib/kubelet/pods/f9ebe9af-b36d-48b6-b5a4-7a8b9031b7c6/volumes" Mar 07 04:41:47 crc kubenswrapper[4689]: I0307 04:41:47.888236 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance99c7-account-delete-l2v7k"] Mar 07 04:41:47 crc kubenswrapper[4689]: W0307 04:41:47.895280 4689 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea952c57_59fa_4e5c_a28c_16f6604f30b4.slice/crio-07e0a925ff4f9a71d61787ded9b27004d40106b76ee33beaeb52c1a41ff0573c WatchSource:0}: Error finding container 07e0a925ff4f9a71d61787ded9b27004d40106b76ee33beaeb52c1a41ff0573c: Status 404 returned error can't find the container with id 07e0a925ff4f9a71d61787ded9b27004d40106b76ee33beaeb52c1a41ff0573c Mar 07 04:41:48 crc kubenswrapper[4689]: I0307 04:41:48.552940 4689 generic.go:334] "Generic (PLEG): container finished" podID="ea952c57-59fa-4e5c-a28c-16f6604f30b4" containerID="d38bf174162378fb2da9da159d97d13ebf904fb880d611cad3eb253e712520b8" exitCode=0 Mar 07 04:41:48 crc kubenswrapper[4689]: I0307 04:41:48.552983 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" event={"ID":"ea952c57-59fa-4e5c-a28c-16f6604f30b4","Type":"ContainerDied","Data":"d38bf174162378fb2da9da159d97d13ebf904fb880d611cad3eb253e712520b8"} Mar 07 04:41:48 crc kubenswrapper[4689]: I0307 04:41:48.553015 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" event={"ID":"ea952c57-59fa-4e5c-a28c-16f6604f30b4","Type":"ContainerStarted","Data":"07e0a925ff4f9a71d61787ded9b27004d40106b76ee33beaeb52c1a41ff0573c"} Mar 07 04:41:49 crc kubenswrapper[4689]: I0307 04:41:49.874437 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" Mar 07 04:41:49 crc kubenswrapper[4689]: I0307 04:41:49.954293 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8tzh\" (UniqueName: \"kubernetes.io/projected/ea952c57-59fa-4e5c-a28c-16f6604f30b4-kube-api-access-k8tzh\") pod \"ea952c57-59fa-4e5c-a28c-16f6604f30b4\" (UID: \"ea952c57-59fa-4e5c-a28c-16f6604f30b4\") " Mar 07 04:41:49 crc kubenswrapper[4689]: I0307 04:41:49.954563 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea952c57-59fa-4e5c-a28c-16f6604f30b4-operator-scripts\") pod \"ea952c57-59fa-4e5c-a28c-16f6604f30b4\" (UID: \"ea952c57-59fa-4e5c-a28c-16f6604f30b4\") " Mar 07 04:41:49 crc kubenswrapper[4689]: I0307 04:41:49.955159 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea952c57-59fa-4e5c-a28c-16f6604f30b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea952c57-59fa-4e5c-a28c-16f6604f30b4" (UID: "ea952c57-59fa-4e5c-a28c-16f6604f30b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:41:49 crc kubenswrapper[4689]: I0307 04:41:49.969382 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea952c57-59fa-4e5c-a28c-16f6604f30b4-kube-api-access-k8tzh" (OuterVolumeSpecName: "kube-api-access-k8tzh") pod "ea952c57-59fa-4e5c-a28c-16f6604f30b4" (UID: "ea952c57-59fa-4e5c-a28c-16f6604f30b4"). InnerVolumeSpecName "kube-api-access-k8tzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:41:50 crc kubenswrapper[4689]: I0307 04:41:50.056003 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8tzh\" (UniqueName: \"kubernetes.io/projected/ea952c57-59fa-4e5c-a28c-16f6604f30b4-kube-api-access-k8tzh\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:50 crc kubenswrapper[4689]: I0307 04:41:50.056053 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea952c57-59fa-4e5c-a28c-16f6604f30b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:50 crc kubenswrapper[4689]: I0307 04:41:50.583552 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" event={"ID":"ea952c57-59fa-4e5c-a28c-16f6604f30b4","Type":"ContainerDied","Data":"07e0a925ff4f9a71d61787ded9b27004d40106b76ee33beaeb52c1a41ff0573c"} Mar 07 04:41:50 crc kubenswrapper[4689]: I0307 04:41:50.583605 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07e0a925ff4f9a71d61787ded9b27004d40106b76ee33beaeb52c1a41ff0573c" Mar 07 04:41:50 crc kubenswrapper[4689]: I0307 04:41:50.583699 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance99c7-account-delete-l2v7k" Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.181274 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-prqhz"] Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.186986 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-prqhz"] Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.196427 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance99c7-account-delete-l2v7k"] Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.201728 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-99c7-account-create-update-65cxn"] Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.206450 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance99c7-account-delete-l2v7k"] Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.210729 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-99c7-account-create-update-65cxn"] Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.831461 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-s6mwz"] Mar 07 04:41:52 crc kubenswrapper[4689]: E0307 04:41:52.831755 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea952c57-59fa-4e5c-a28c-16f6604f30b4" containerName="mariadb-account-delete" Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.831770 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea952c57-59fa-4e5c-a28c-16f6604f30b4" containerName="mariadb-account-delete" Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.831914 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea952c57-59fa-4e5c-a28c-16f6604f30b4" containerName="mariadb-account-delete" Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.832403 4689 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-s6mwz" Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.837600 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-3153-account-create-update-p4bp6"] Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.838409 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.839944 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.848180 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-s6mwz"] Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.853937 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3153-account-create-update-p4bp6"] Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.903406 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/badeb223-8f16-446f-8b09-ee99b5fc8ee7-operator-scripts\") pod \"glance-db-create-s6mwz\" (UID: \"badeb223-8f16-446f-8b09-ee99b5fc8ee7\") " pod="glance-kuttl-tests/glance-db-create-s6mwz" Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.903631 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmzz5\" (UniqueName: \"kubernetes.io/projected/badeb223-8f16-446f-8b09-ee99b5fc8ee7-kube-api-access-pmzz5\") pod \"glance-db-create-s6mwz\" (UID: \"badeb223-8f16-446f-8b09-ee99b5fc8ee7\") " pod="glance-kuttl-tests/glance-db-create-s6mwz" Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.903735 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-jpxdx\" (UniqueName: \"kubernetes.io/projected/d2e0245c-2824-46a4-8c46-e5dabccff5e5-kube-api-access-jpxdx\") pod \"glance-3153-account-create-update-p4bp6\" (UID: \"d2e0245c-2824-46a4-8c46-e5dabccff5e5\") " pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" Mar 07 04:41:52 crc kubenswrapper[4689]: I0307 04:41:52.903838 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2e0245c-2824-46a4-8c46-e5dabccff5e5-operator-scripts\") pod \"glance-3153-account-create-update-p4bp6\" (UID: \"d2e0245c-2824-46a4-8c46-e5dabccff5e5\") " pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.005242 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmzz5\" (UniqueName: \"kubernetes.io/projected/badeb223-8f16-446f-8b09-ee99b5fc8ee7-kube-api-access-pmzz5\") pod \"glance-db-create-s6mwz\" (UID: \"badeb223-8f16-446f-8b09-ee99b5fc8ee7\") " pod="glance-kuttl-tests/glance-db-create-s6mwz" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.005542 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpxdx\" (UniqueName: \"kubernetes.io/projected/d2e0245c-2824-46a4-8c46-e5dabccff5e5-kube-api-access-jpxdx\") pod \"glance-3153-account-create-update-p4bp6\" (UID: \"d2e0245c-2824-46a4-8c46-e5dabccff5e5\") " pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.005658 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2e0245c-2824-46a4-8c46-e5dabccff5e5-operator-scripts\") pod \"glance-3153-account-create-update-p4bp6\" (UID: \"d2e0245c-2824-46a4-8c46-e5dabccff5e5\") " 
pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.005789 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/badeb223-8f16-446f-8b09-ee99b5fc8ee7-operator-scripts\") pod \"glance-db-create-s6mwz\" (UID: \"badeb223-8f16-446f-8b09-ee99b5fc8ee7\") " pod="glance-kuttl-tests/glance-db-create-s6mwz" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.006815 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2e0245c-2824-46a4-8c46-e5dabccff5e5-operator-scripts\") pod \"glance-3153-account-create-update-p4bp6\" (UID: \"d2e0245c-2824-46a4-8c46-e5dabccff5e5\") " pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.007237 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/badeb223-8f16-446f-8b09-ee99b5fc8ee7-operator-scripts\") pod \"glance-db-create-s6mwz\" (UID: \"badeb223-8f16-446f-8b09-ee99b5fc8ee7\") " pod="glance-kuttl-tests/glance-db-create-s6mwz" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.025989 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpxdx\" (UniqueName: \"kubernetes.io/projected/d2e0245c-2824-46a4-8c46-e5dabccff5e5-kube-api-access-jpxdx\") pod \"glance-3153-account-create-update-p4bp6\" (UID: \"d2e0245c-2824-46a4-8c46-e5dabccff5e5\") " pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.026053 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmzz5\" (UniqueName: \"kubernetes.io/projected/badeb223-8f16-446f-8b09-ee99b5fc8ee7-kube-api-access-pmzz5\") pod \"glance-db-create-s6mwz\" (UID: 
\"badeb223-8f16-446f-8b09-ee99b5fc8ee7\") " pod="glance-kuttl-tests/glance-db-create-s6mwz" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.150670 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-s6mwz" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.157610 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.676936 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-s6mwz"] Mar 07 04:41:53 crc kubenswrapper[4689]: W0307 04:41:53.677759 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2e0245c_2824_46a4_8c46_e5dabccff5e5.slice/crio-26e25eb423c312c842c422d11fa938bdeb45ca6fd8533b4f952a5a939b922a0b WatchSource:0}: Error finding container 26e25eb423c312c842c422d11fa938bdeb45ca6fd8533b4f952a5a939b922a0b: Status 404 returned error can't find the container with id 26e25eb423c312c842c422d11fa938bdeb45ca6fd8533b4f952a5a939b922a0b Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.694907 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3153-account-create-update-p4bp6"] Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.836018 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e18750b-34ac-4c21-8b43-7b6dad049da6" path="/var/lib/kubelet/pods/7e18750b-34ac-4c21-8b43-7b6dad049da6/volumes" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.837081 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b14ab1c2-06d1-4279-bfa3-56af17e87ec2" path="/var/lib/kubelet/pods/b14ab1c2-06d1-4279-bfa3-56af17e87ec2/volumes" Mar 07 04:41:53 crc kubenswrapper[4689]: I0307 04:41:53.837756 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="ea952c57-59fa-4e5c-a28c-16f6604f30b4" path="/var/lib/kubelet/pods/ea952c57-59fa-4e5c-a28c-16f6604f30b4/volumes" Mar 07 04:41:54 crc kubenswrapper[4689]: I0307 04:41:54.638054 4689 generic.go:334] "Generic (PLEG): container finished" podID="d2e0245c-2824-46a4-8c46-e5dabccff5e5" containerID="7d55038de70538245d5fde3aa9812ce8fabefd8263074eca56f8e3c1112ca79e" exitCode=0 Mar 07 04:41:54 crc kubenswrapper[4689]: I0307 04:41:54.638225 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" event={"ID":"d2e0245c-2824-46a4-8c46-e5dabccff5e5","Type":"ContainerDied","Data":"7d55038de70538245d5fde3aa9812ce8fabefd8263074eca56f8e3c1112ca79e"} Mar 07 04:41:54 crc kubenswrapper[4689]: I0307 04:41:54.638273 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" event={"ID":"d2e0245c-2824-46a4-8c46-e5dabccff5e5","Type":"ContainerStarted","Data":"26e25eb423c312c842c422d11fa938bdeb45ca6fd8533b4f952a5a939b922a0b"} Mar 07 04:41:54 crc kubenswrapper[4689]: I0307 04:41:54.643118 4689 generic.go:334] "Generic (PLEG): container finished" podID="badeb223-8f16-446f-8b09-ee99b5fc8ee7" containerID="bb343f2f9166b9bc4959d314ca3264486f218ba0a85de9b3452e5d85f221c45f" exitCode=0 Mar 07 04:41:54 crc kubenswrapper[4689]: I0307 04:41:54.643220 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-s6mwz" event={"ID":"badeb223-8f16-446f-8b09-ee99b5fc8ee7","Type":"ContainerDied","Data":"bb343f2f9166b9bc4959d314ca3264486f218ba0a85de9b3452e5d85f221c45f"} Mar 07 04:41:54 crc kubenswrapper[4689]: I0307 04:41:54.643265 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-s6mwz" event={"ID":"badeb223-8f16-446f-8b09-ee99b5fc8ee7","Type":"ContainerStarted","Data":"2235133d0df3e4314be2b8fda56f5ae7ff92009c3f0840f35e4eebd1770f98ff"} Mar 07 04:41:55 crc kubenswrapper[4689]: I0307 04:41:55.975152 
4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-s6mwz" Mar 07 04:41:55 crc kubenswrapper[4689]: I0307 04:41:55.979713 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.157051 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/badeb223-8f16-446f-8b09-ee99b5fc8ee7-operator-scripts\") pod \"badeb223-8f16-446f-8b09-ee99b5fc8ee7\" (UID: \"badeb223-8f16-446f-8b09-ee99b5fc8ee7\") " Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.157241 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2e0245c-2824-46a4-8c46-e5dabccff5e5-operator-scripts\") pod \"d2e0245c-2824-46a4-8c46-e5dabccff5e5\" (UID: \"d2e0245c-2824-46a4-8c46-e5dabccff5e5\") " Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.157260 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmzz5\" (UniqueName: \"kubernetes.io/projected/badeb223-8f16-446f-8b09-ee99b5fc8ee7-kube-api-access-pmzz5\") pod \"badeb223-8f16-446f-8b09-ee99b5fc8ee7\" (UID: \"badeb223-8f16-446f-8b09-ee99b5fc8ee7\") " Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.157316 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpxdx\" (UniqueName: \"kubernetes.io/projected/d2e0245c-2824-46a4-8c46-e5dabccff5e5-kube-api-access-jpxdx\") pod \"d2e0245c-2824-46a4-8c46-e5dabccff5e5\" (UID: \"d2e0245c-2824-46a4-8c46-e5dabccff5e5\") " Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.157622 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d2e0245c-2824-46a4-8c46-e5dabccff5e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2e0245c-2824-46a4-8c46-e5dabccff5e5" (UID: "d2e0245c-2824-46a4-8c46-e5dabccff5e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.157754 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/badeb223-8f16-446f-8b09-ee99b5fc8ee7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "badeb223-8f16-446f-8b09-ee99b5fc8ee7" (UID: "badeb223-8f16-446f-8b09-ee99b5fc8ee7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.162363 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/badeb223-8f16-446f-8b09-ee99b5fc8ee7-kube-api-access-pmzz5" (OuterVolumeSpecName: "kube-api-access-pmzz5") pod "badeb223-8f16-446f-8b09-ee99b5fc8ee7" (UID: "badeb223-8f16-446f-8b09-ee99b5fc8ee7"). InnerVolumeSpecName "kube-api-access-pmzz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.172906 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2e0245c-2824-46a4-8c46-e5dabccff5e5-kube-api-access-jpxdx" (OuterVolumeSpecName: "kube-api-access-jpxdx") pod "d2e0245c-2824-46a4-8c46-e5dabccff5e5" (UID: "d2e0245c-2824-46a4-8c46-e5dabccff5e5"). InnerVolumeSpecName "kube-api-access-jpxdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.258574 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2e0245c-2824-46a4-8c46-e5dabccff5e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.258628 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmzz5\" (UniqueName: \"kubernetes.io/projected/badeb223-8f16-446f-8b09-ee99b5fc8ee7-kube-api-access-pmzz5\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.258652 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpxdx\" (UniqueName: \"kubernetes.io/projected/d2e0245c-2824-46a4-8c46-e5dabccff5e5-kube-api-access-jpxdx\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.258670 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/badeb223-8f16-446f-8b09-ee99b5fc8ee7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.665775 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" event={"ID":"d2e0245c-2824-46a4-8c46-e5dabccff5e5","Type":"ContainerDied","Data":"26e25eb423c312c842c422d11fa938bdeb45ca6fd8533b4f952a5a939b922a0b"} Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.666140 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26e25eb423c312c842c422d11fa938bdeb45ca6fd8533b4f952a5a939b922a0b" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.665811 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3153-account-create-update-p4bp6" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.668341 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-s6mwz" event={"ID":"badeb223-8f16-446f-8b09-ee99b5fc8ee7","Type":"ContainerDied","Data":"2235133d0df3e4314be2b8fda56f5ae7ff92009c3f0840f35e4eebd1770f98ff"} Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.668398 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2235133d0df3e4314be2b8fda56f5ae7ff92009c3f0840f35e4eebd1770f98ff" Mar 07 04:41:56 crc kubenswrapper[4689]: I0307 04:41:56.668410 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-s6mwz" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.034882 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-s7wqk"] Mar 07 04:41:58 crc kubenswrapper[4689]: E0307 04:41:58.035715 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2e0245c-2824-46a4-8c46-e5dabccff5e5" containerName="mariadb-account-create-update" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.035797 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2e0245c-2824-46a4-8c46-e5dabccff5e5" containerName="mariadb-account-create-update" Mar 07 04:41:58 crc kubenswrapper[4689]: E0307 04:41:58.035882 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="badeb223-8f16-446f-8b09-ee99b5fc8ee7" containerName="mariadb-database-create" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.035939 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="badeb223-8f16-446f-8b09-ee99b5fc8ee7" containerName="mariadb-database-create" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.036108 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="badeb223-8f16-446f-8b09-ee99b5fc8ee7" 
containerName="mariadb-database-create" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.036186 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2e0245c-2824-46a4-8c46-e5dabccff5e5" containerName="mariadb-account-create-update" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.036738 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.039055 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-khbpz" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.046927 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.047277 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-s7wqk"] Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.187905 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68xv6\" (UniqueName: \"kubernetes.io/projected/7d31bed1-9172-4a7a-a066-8d80352d60d5-kube-api-access-68xv6\") pod \"glance-db-sync-s7wqk\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.188204 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-config-data\") pod \"glance-db-sync-s7wqk\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.188423 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-db-sync-config-data\") pod \"glance-db-sync-s7wqk\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.290119 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-db-sync-config-data\") pod \"glance-db-sync-s7wqk\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.290389 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68xv6\" (UniqueName: \"kubernetes.io/projected/7d31bed1-9172-4a7a-a066-8d80352d60d5-kube-api-access-68xv6\") pod \"glance-db-sync-s7wqk\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.290436 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-config-data\") pod \"glance-db-sync-s7wqk\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.309154 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-config-data\") pod \"glance-db-sync-s7wqk\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.310663 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-db-sync-config-data\") pod 
\"glance-db-sync-s7wqk\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.311953 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68xv6\" (UniqueName: \"kubernetes.io/projected/7d31bed1-9172-4a7a-a066-8d80352d60d5-kube-api-access-68xv6\") pod \"glance-db-sync-s7wqk\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.367984 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:41:58 crc kubenswrapper[4689]: I0307 04:41:58.754817 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-s7wqk"] Mar 07 04:41:58 crc kubenswrapper[4689]: W0307 04:41:58.765454 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d31bed1_9172_4a7a_a066_8d80352d60d5.slice/crio-432aa9aad7a35957e6066022ec06b0a8a247149646f490867f4cd0ddb4202f31 WatchSource:0}: Error finding container 432aa9aad7a35957e6066022ec06b0a8a247149646f490867f4cd0ddb4202f31: Status 404 returned error can't find the container with id 432aa9aad7a35957e6066022ec06b0a8a247149646f490867f4cd0ddb4202f31 Mar 07 04:41:59 crc kubenswrapper[4689]: I0307 04:41:59.189513 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:41:59 crc kubenswrapper[4689]: I0307 04:41:59.189568 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:41:59 crc kubenswrapper[4689]: I0307 04:41:59.692626 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-s7wqk" event={"ID":"7d31bed1-9172-4a7a-a066-8d80352d60d5","Type":"ContainerStarted","Data":"6618bca92086fa9df2e83b3682a84d1f46d3dbb360d76a9572497e866fc42d5a"} Mar 07 04:41:59 crc kubenswrapper[4689]: I0307 04:41:59.692668 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-s7wqk" event={"ID":"7d31bed1-9172-4a7a-a066-8d80352d60d5","Type":"ContainerStarted","Data":"432aa9aad7a35957e6066022ec06b0a8a247149646f490867f4cd0ddb4202f31"} Mar 07 04:41:59 crc kubenswrapper[4689]: I0307 04:41:59.736638 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-s7wqk" podStartSLOduration=1.736615612 podStartE2EDuration="1.736615612s" podCreationTimestamp="2026-03-07 04:41:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:41:59.72688542 +0000 UTC m=+1364.773268919" watchObservedRunningTime="2026-03-07 04:41:59.736615612 +0000 UTC m=+1364.782999101" Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.130588 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547642-bpjt8"] Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.131632 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547642-bpjt8" Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.135493 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.135908 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.143238 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.143928 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547642-bpjt8"] Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.219345 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5x8w\" (UniqueName: \"kubernetes.io/projected/4221dd55-156a-45e5-8de8-e5820ecb5f10-kube-api-access-v5x8w\") pod \"auto-csr-approver-29547642-bpjt8\" (UID: \"4221dd55-156a-45e5-8de8-e5820ecb5f10\") " pod="openshift-infra/auto-csr-approver-29547642-bpjt8" Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.321265 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5x8w\" (UniqueName: \"kubernetes.io/projected/4221dd55-156a-45e5-8de8-e5820ecb5f10-kube-api-access-v5x8w\") pod \"auto-csr-approver-29547642-bpjt8\" (UID: \"4221dd55-156a-45e5-8de8-e5820ecb5f10\") " pod="openshift-infra/auto-csr-approver-29547642-bpjt8" Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.340023 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5x8w\" (UniqueName: \"kubernetes.io/projected/4221dd55-156a-45e5-8de8-e5820ecb5f10-kube-api-access-v5x8w\") pod \"auto-csr-approver-29547642-bpjt8\" (UID: \"4221dd55-156a-45e5-8de8-e5820ecb5f10\") " 
pod="openshift-infra/auto-csr-approver-29547642-bpjt8" Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.453326 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547642-bpjt8" Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.976223 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547642-bpjt8"] Mar 07 04:42:00 crc kubenswrapper[4689]: I0307 04:42:00.979472 4689 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 04:42:01 crc kubenswrapper[4689]: I0307 04:42:01.715719 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547642-bpjt8" event={"ID":"4221dd55-156a-45e5-8de8-e5820ecb5f10","Type":"ContainerStarted","Data":"b6dc4d316a21d70a760841c21f7d183ec291c8c4a4db8e73c7b8f6272e79612f"} Mar 07 04:42:02 crc kubenswrapper[4689]: I0307 04:42:02.728086 4689 generic.go:334] "Generic (PLEG): container finished" podID="7d31bed1-9172-4a7a-a066-8d80352d60d5" containerID="6618bca92086fa9df2e83b3682a84d1f46d3dbb360d76a9572497e866fc42d5a" exitCode=0 Mar 07 04:42:02 crc kubenswrapper[4689]: I0307 04:42:02.728505 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-s7wqk" event={"ID":"7d31bed1-9172-4a7a-a066-8d80352d60d5","Type":"ContainerDied","Data":"6618bca92086fa9df2e83b3682a84d1f46d3dbb360d76a9572497e866fc42d5a"} Mar 07 04:42:02 crc kubenswrapper[4689]: I0307 04:42:02.732920 4689 generic.go:334] "Generic (PLEG): container finished" podID="4221dd55-156a-45e5-8de8-e5820ecb5f10" containerID="b28d5dd8095fbae7ccc0311ee6792a744df56ab770cb0df72557db94cb219e45" exitCode=0 Mar 07 04:42:02 crc kubenswrapper[4689]: I0307 04:42:02.732995 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547642-bpjt8" 
event={"ID":"4221dd55-156a-45e5-8de8-e5820ecb5f10","Type":"ContainerDied","Data":"b28d5dd8095fbae7ccc0311ee6792a744df56ab770cb0df72557db94cb219e45"} Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.162919 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547642-bpjt8" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.175681 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.283805 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5x8w\" (UniqueName: \"kubernetes.io/projected/4221dd55-156a-45e5-8de8-e5820ecb5f10-kube-api-access-v5x8w\") pod \"4221dd55-156a-45e5-8de8-e5820ecb5f10\" (UID: \"4221dd55-156a-45e5-8de8-e5820ecb5f10\") " Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.283888 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-config-data\") pod \"7d31bed1-9172-4a7a-a066-8d80352d60d5\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.283915 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68xv6\" (UniqueName: \"kubernetes.io/projected/7d31bed1-9172-4a7a-a066-8d80352d60d5-kube-api-access-68xv6\") pod \"7d31bed1-9172-4a7a-a066-8d80352d60d5\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.283963 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-db-sync-config-data\") pod \"7d31bed1-9172-4a7a-a066-8d80352d60d5\" (UID: \"7d31bed1-9172-4a7a-a066-8d80352d60d5\") " Mar 07 04:42:04 
crc kubenswrapper[4689]: I0307 04:42:04.289791 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7d31bed1-9172-4a7a-a066-8d80352d60d5" (UID: "7d31bed1-9172-4a7a-a066-8d80352d60d5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.290967 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d31bed1-9172-4a7a-a066-8d80352d60d5-kube-api-access-68xv6" (OuterVolumeSpecName: "kube-api-access-68xv6") pod "7d31bed1-9172-4a7a-a066-8d80352d60d5" (UID: "7d31bed1-9172-4a7a-a066-8d80352d60d5"). InnerVolumeSpecName "kube-api-access-68xv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.291396 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4221dd55-156a-45e5-8de8-e5820ecb5f10-kube-api-access-v5x8w" (OuterVolumeSpecName: "kube-api-access-v5x8w") pod "4221dd55-156a-45e5-8de8-e5820ecb5f10" (UID: "4221dd55-156a-45e5-8de8-e5820ecb5f10"). InnerVolumeSpecName "kube-api-access-v5x8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.318671 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-config-data" (OuterVolumeSpecName: "config-data") pod "7d31bed1-9172-4a7a-a066-8d80352d60d5" (UID: "7d31bed1-9172-4a7a-a066-8d80352d60d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.385271 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5x8w\" (UniqueName: \"kubernetes.io/projected/4221dd55-156a-45e5-8de8-e5820ecb5f10-kube-api-access-v5x8w\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.385309 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.385323 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68xv6\" (UniqueName: \"kubernetes.io/projected/7d31bed1-9172-4a7a-a066-8d80352d60d5-kube-api-access-68xv6\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.385335 4689 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d31bed1-9172-4a7a-a066-8d80352d60d5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.755138 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-s7wqk" event={"ID":"7d31bed1-9172-4a7a-a066-8d80352d60d5","Type":"ContainerDied","Data":"432aa9aad7a35957e6066022ec06b0a8a247149646f490867f4cd0ddb4202f31"} Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.755222 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="432aa9aad7a35957e6066022ec06b0a8a247149646f490867f4cd0ddb4202f31" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.755537 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-s7wqk" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.757269 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547642-bpjt8" event={"ID":"4221dd55-156a-45e5-8de8-e5820ecb5f10","Type":"ContainerDied","Data":"b6dc4d316a21d70a760841c21f7d183ec291c8c4a4db8e73c7b8f6272e79612f"} Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.757311 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6dc4d316a21d70a760841c21f7d183ec291c8c4a4db8e73c7b8f6272e79612f" Mar 07 04:42:04 crc kubenswrapper[4689]: I0307 04:42:04.757370 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547642-bpjt8" Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.235385 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547636-h7t9g"] Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.243826 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547636-h7t9g"] Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.837032 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd546d8b-6b91-4fbe-91a0-b16532fc2759" path="/var/lib/kubelet/pods/cd546d8b-6b91-4fbe-91a0-b16532fc2759/volumes" Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.937745 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:42:05 crc kubenswrapper[4689]: E0307 04:42:05.938042 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d31bed1-9172-4a7a-a066-8d80352d60d5" containerName="glance-db-sync" Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.938060 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d31bed1-9172-4a7a-a066-8d80352d60d5" containerName="glance-db-sync" Mar 07 04:42:05 crc 
kubenswrapper[4689]: E0307 04:42:05.938070 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4221dd55-156a-45e5-8de8-e5820ecb5f10" containerName="oc" Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.938077 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="4221dd55-156a-45e5-8de8-e5820ecb5f10" containerName="oc" Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.938230 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="4221dd55-156a-45e5-8de8-e5820ecb5f10" containerName="oc" Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.938253 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d31bed1-9172-4a7a-a066-8d80352d60d5" containerName="glance-db-sync" Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.938914 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.942238 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.942553 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-khbpz" Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.942677 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Mar 07 04:42:05 crc kubenswrapper[4689]: I0307 04:42:05.961588 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.011785 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-run\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " 
pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.011858 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.011893 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-lib-modules\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.011952 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.011974 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-sys\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.011996 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpzft\" (UniqueName: \"kubernetes.io/projected/2539324b-8bd8-4820-8b6a-bc1cc184152e-kube-api-access-jpzft\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " 
pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.012034 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.012062 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-logs\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.012087 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-scripts\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.012109 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.012131 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-httpd-run\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " 
pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.012154 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-dev\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.012210 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-nvme\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.012246 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-config-data\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113645 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-sys\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113679 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpzft\" (UniqueName: \"kubernetes.io/projected/2539324b-8bd8-4820-8b6a-bc1cc184152e-kube-api-access-jpzft\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 
07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113709 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113727 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-logs\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113751 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-scripts\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113770 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-httpd-run\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113782 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113806 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-dev\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113826 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-nvme\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113848 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-config-data\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113872 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-run\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113899 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113919 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-lib-modules\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.113953 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.114031 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-sys\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.114075 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-dev\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.114122 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.114202 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") device mount path 
\"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.114551 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-logs\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.115038 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-httpd-run\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.115099 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.115125 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-run\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.115157 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-nvme\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.117699 4689 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.120266 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-lib-modules\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.123866 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-scripts\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.125592 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-config-data\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.139665 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.143917 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpzft\" (UniqueName: 
\"kubernetes.io/projected/2539324b-8bd8-4820-8b6a-bc1cc184152e-kube-api-access-jpzft\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.153587 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.256577 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.675157 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:42:06 crc kubenswrapper[4689]: I0307 04:42:06.780257 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2539324b-8bd8-4820-8b6a-bc1cc184152e","Type":"ContainerStarted","Data":"0f85128b8d4460e08e9d64914818daba1fb46bd8087f1f4f960522988998b4ee"} Mar 07 04:42:07 crc kubenswrapper[4689]: I0307 04:42:07.791626 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2539324b-8bd8-4820-8b6a-bc1cc184152e","Type":"ContainerStarted","Data":"dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61"} Mar 07 04:42:07 crc kubenswrapper[4689]: I0307 04:42:07.793800 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2539324b-8bd8-4820-8b6a-bc1cc184152e","Type":"ContainerStarted","Data":"9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6"} Mar 07 04:42:07 crc kubenswrapper[4689]: I0307 04:42:07.822486 4689 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.8224670830000003 podStartE2EDuration="2.822467083s" podCreationTimestamp="2026-03-07 04:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:42:07.816482862 +0000 UTC m=+1372.862866351" watchObservedRunningTime="2026-03-07 04:42:07.822467083 +0000 UTC m=+1372.868850572" Mar 07 04:42:16 crc kubenswrapper[4689]: I0307 04:42:16.256918 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:16 crc kubenswrapper[4689]: I0307 04:42:16.257483 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:16 crc kubenswrapper[4689]: I0307 04:42:16.287104 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:16 crc kubenswrapper[4689]: I0307 04:42:16.323858 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:16 crc kubenswrapper[4689]: I0307 04:42:16.879915 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:16 crc kubenswrapper[4689]: I0307 04:42:16.879986 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:18 crc kubenswrapper[4689]: I0307 04:42:18.835230 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:18 crc kubenswrapper[4689]: I0307 04:42:18.872394 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:22 crc kubenswrapper[4689]: 
I0307 04:42:22.058724 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.060998 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.064647 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.065874 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.077858 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.103405 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.189844 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-config-data\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.189897 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-lib-modules\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.189923 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190017 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-config-data\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190042 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-run\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190073 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw9dt\" (UniqueName: \"kubernetes.io/projected/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-kube-api-access-sw9dt\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190113 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190208 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190310 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190339 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-scripts\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190362 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-nvme\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190434 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-logs\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190470 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-scripts\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190502 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-sys\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190535 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-dev\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190579 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-run\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190627 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-httpd-run\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190670 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190727 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-logs\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190744 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190814 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190848 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-httpd-run\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190880 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-lib-modules\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190922 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zckhz\" (UniqueName: \"kubernetes.io/projected/3d3a8105-63e3-463c-8d7c-eaa06b97f617-kube-api-access-zckhz\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.190957 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-nvme\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.191059 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-sys\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.191166 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-dev\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.191207 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.292604 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.292673 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.292712 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.292747 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-scripts\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.292762 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-iscsi\") pod \"glance-default-single-1\" (UID: 
\"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.292875 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-nvme\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.292848 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.292784 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-nvme\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.292956 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-logs\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.292994 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-scripts\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: 
I0307 04:42:22.293026 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-sys\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293060 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-dev\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293083 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293113 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-sys\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293095 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-run\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293138 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-dev\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293205 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-run\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293216 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-httpd-run\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293256 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293286 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-logs\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293336 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: 
\"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293383 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293415 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-httpd-run\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293447 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-lib-modules\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293487 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zckhz\" (UniqueName: \"kubernetes.io/projected/3d3a8105-63e3-463c-8d7c-eaa06b97f617-kube-api-access-zckhz\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293519 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-nvme\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 
07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293556 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-sys\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293612 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-dev\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293643 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293679 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-config-data\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293699 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293712 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-lib-modules\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293739 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-lib-modules\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293744 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-sys\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293771 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-nvme\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293781 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293715 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-httpd-run\") pod \"glance-default-single-1\" (UID: 
\"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293808 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-dev\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293813 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-lib-modules\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293744 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293838 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.293889 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") device mount path \"/mnt/openstack/pv20\"" 
pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.294010 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.294400 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-logs\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.294487 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-config-data\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.294511 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-logs\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.294539 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-run\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.294585 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw9dt\" (UniqueName: \"kubernetes.io/projected/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-kube-api-access-sw9dt\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.294923 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-run\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.301416 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-httpd-run\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.303549 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-scripts\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.303587 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-scripts\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.305853 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-config-data\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.309058 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-config-data\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.318243 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.322378 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.328550 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zckhz\" (UniqueName: \"kubernetes.io/projected/3d3a8105-63e3-463c-8d7c-eaa06b97f617-kube-api-access-zckhz\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.329995 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw9dt\" (UniqueName: \"kubernetes.io/projected/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-kube-api-access-sw9dt\") pod \"glance-default-single-2\" (UID: 
\"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.337157 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-2\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.343882 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.384034 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.397905 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.665213 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:42:22 crc kubenswrapper[4689]: W0307 04:42:22.673797 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d3a8105_63e3_463c_8d7c_eaa06b97f617.slice/crio-29189945f43b763fa3e646de1456d32826c3b74859399e967bdaf542b488fefc WatchSource:0}: Error finding container 29189945f43b763fa3e646de1456d32826c3b74859399e967bdaf542b488fefc: Status 404 returned error can't find the container with id 29189945f43b763fa3e646de1456d32826c3b74859399e967bdaf542b488fefc Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.710543 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Mar 07 04:42:22 crc kubenswrapper[4689]: W0307 04:42:22.714706 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2572d8e_c88c_44a1_bdba_27c7342dd0ac.slice/crio-06dd865409ed30f7bf62d205bf05b04f63c50d3da3f843e758fb82fb4eabbdf2 WatchSource:0}: Error finding container 06dd865409ed30f7bf62d205bf05b04f63c50d3da3f843e758fb82fb4eabbdf2: Status 404 returned error can't find the container with id 06dd865409ed30f7bf62d205bf05b04f63c50d3da3f843e758fb82fb4eabbdf2 Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.935401 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"a2572d8e-c88c-44a1-bdba-27c7342dd0ac","Type":"ContainerStarted","Data":"8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781"} Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.935796 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" 
event={"ID":"a2572d8e-c88c-44a1-bdba-27c7342dd0ac","Type":"ContainerStarted","Data":"06dd865409ed30f7bf62d205bf05b04f63c50d3da3f843e758fb82fb4eabbdf2"} Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.941239 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"3d3a8105-63e3-463c-8d7c-eaa06b97f617","Type":"ContainerStarted","Data":"5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a"} Mar 07 04:42:22 crc kubenswrapper[4689]: I0307 04:42:22.941289 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"3d3a8105-63e3-463c-8d7c-eaa06b97f617","Type":"ContainerStarted","Data":"29189945f43b763fa3e646de1456d32826c3b74859399e967bdaf542b488fefc"} Mar 07 04:42:23 crc kubenswrapper[4689]: I0307 04:42:23.972306 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"3d3a8105-63e3-463c-8d7c-eaa06b97f617","Type":"ContainerStarted","Data":"0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa"} Mar 07 04:42:23 crc kubenswrapper[4689]: I0307 04:42:23.974069 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"a2572d8e-c88c-44a1-bdba-27c7342dd0ac","Type":"ContainerStarted","Data":"dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f"} Mar 07 04:42:24 crc kubenswrapper[4689]: I0307 04:42:24.005888 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=3.005869395 podStartE2EDuration="3.005869395s" podCreationTimestamp="2026-03-07 04:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:42:24.000463269 +0000 UTC m=+1389.046846768" watchObservedRunningTime="2026-03-07 04:42:24.005869395 +0000 UTC 
m=+1389.052252894" Mar 07 04:42:24 crc kubenswrapper[4689]: I0307 04:42:24.025561 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-2" podStartSLOduration=3.025540156 podStartE2EDuration="3.025540156s" podCreationTimestamp="2026-03-07 04:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:42:24.01943019 +0000 UTC m=+1389.065813689" watchObservedRunningTime="2026-03-07 04:42:24.025540156 +0000 UTC m=+1389.071923645" Mar 07 04:42:29 crc kubenswrapper[4689]: I0307 04:42:29.189553 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:42:29 crc kubenswrapper[4689]: I0307 04:42:29.190233 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:42:29 crc kubenswrapper[4689]: I0307 04:42:29.190296 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:42:29 crc kubenswrapper[4689]: I0307 04:42:29.191136 4689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d7f7f5d4bedb9f0999f9f7b5b22121b12b61459642fd73d8cbc908ec8691b15"} pod="openshift-machine-config-operator/machine-config-daemon-dss5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 04:42:29 crc 
kubenswrapper[4689]: I0307 04:42:29.191253 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" containerID="cri-o://1d7f7f5d4bedb9f0999f9f7b5b22121b12b61459642fd73d8cbc908ec8691b15" gracePeriod=600 Mar 07 04:42:30 crc kubenswrapper[4689]: I0307 04:42:30.035511 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerID="1d7f7f5d4bedb9f0999f9f7b5b22121b12b61459642fd73d8cbc908ec8691b15" exitCode=0 Mar 07 04:42:30 crc kubenswrapper[4689]: I0307 04:42:30.035753 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerDied","Data":"1d7f7f5d4bedb9f0999f9f7b5b22121b12b61459642fd73d8cbc908ec8691b15"} Mar 07 04:42:30 crc kubenswrapper[4689]: I0307 04:42:30.036029 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536"} Mar 07 04:42:30 crc kubenswrapper[4689]: I0307 04:42:30.036054 4689 scope.go:117] "RemoveContainer" containerID="ae730408636ba5641da1384c81b782848c445de37ccd29b97d13a35866436afe" Mar 07 04:42:32 crc kubenswrapper[4689]: I0307 04:42:32.385018 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:32 crc kubenswrapper[4689]: I0307 04:42:32.385448 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:32 crc kubenswrapper[4689]: I0307 04:42:32.399088 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:32 crc kubenswrapper[4689]: I0307 04:42:32.399142 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:32 crc kubenswrapper[4689]: I0307 04:42:32.432102 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:32 crc kubenswrapper[4689]: I0307 04:42:32.433989 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:32 crc kubenswrapper[4689]: I0307 04:42:32.458287 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:32 crc kubenswrapper[4689]: I0307 04:42:32.484279 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:33 crc kubenswrapper[4689]: I0307 04:42:33.062422 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:33 crc kubenswrapper[4689]: I0307 04:42:33.062461 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:33 crc kubenswrapper[4689]: I0307 04:42:33.062473 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:33 crc kubenswrapper[4689]: I0307 04:42:33.062481 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:34 crc kubenswrapper[4689]: I0307 04:42:34.938784 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:35 crc kubenswrapper[4689]: I0307 04:42:35.055160 4689 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:35 crc kubenswrapper[4689]: I0307 04:42:35.057292 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:35 crc kubenswrapper[4689]: I0307 04:42:35.064713 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:35 crc kubenswrapper[4689]: I0307 04:42:35.506057 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Mar 07 04:42:35 crc kubenswrapper[4689]: I0307 04:42:35.511570 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:42:37 crc kubenswrapper[4689]: I0307 04:42:37.096401 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="a2572d8e-c88c-44a1-bdba-27c7342dd0ac" containerName="glance-log" containerID="cri-o://8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781" gracePeriod=30 Mar 07 04:42:37 crc kubenswrapper[4689]: I0307 04:42:37.096665 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="a2572d8e-c88c-44a1-bdba-27c7342dd0ac" containerName="glance-httpd" containerID="cri-o://dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f" gracePeriod=30 Mar 07 04:42:37 crc kubenswrapper[4689]: I0307 04:42:37.096810 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="3d3a8105-63e3-463c-8d7c-eaa06b97f617" containerName="glance-log" containerID="cri-o://5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a" gracePeriod=30 Mar 07 04:42:37 crc kubenswrapper[4689]: I0307 04:42:37.096887 4689 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="3d3a8105-63e3-463c-8d7c-eaa06b97f617" containerName="glance-httpd" containerID="cri-o://0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa" gracePeriod=30 Mar 07 04:42:38 crc kubenswrapper[4689]: I0307 04:42:38.106888 4689 generic.go:334] "Generic (PLEG): container finished" podID="3d3a8105-63e3-463c-8d7c-eaa06b97f617" containerID="5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a" exitCode=143 Mar 07 04:42:38 crc kubenswrapper[4689]: I0307 04:42:38.106993 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"3d3a8105-63e3-463c-8d7c-eaa06b97f617","Type":"ContainerDied","Data":"5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a"} Mar 07 04:42:38 crc kubenswrapper[4689]: I0307 04:42:38.109653 4689 generic.go:334] "Generic (PLEG): container finished" podID="a2572d8e-c88c-44a1-bdba-27c7342dd0ac" containerID="8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781" exitCode=143 Mar 07 04:42:38 crc kubenswrapper[4689]: I0307 04:42:38.109752 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"a2572d8e-c88c-44a1-bdba-27c7342dd0ac","Type":"ContainerDied","Data":"8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781"} Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.764045 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.767892 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794500 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-var-locks-brick\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794553 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-iscsi\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794582 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-scripts\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794601 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-lib-modules\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794626 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794647 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-httpd-run\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794668 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-run\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794704 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-config-data\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794736 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-dev\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794751 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-httpd-run\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794789 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zckhz\" (UniqueName: \"kubernetes.io/projected/3d3a8105-63e3-463c-8d7c-eaa06b97f617-kube-api-access-zckhz\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794805 4689 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-iscsi\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794823 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-nvme\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794842 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-logs\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794862 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-logs\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794879 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-sys\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794893 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-lib-modules\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 
04:42:40.794913 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-run\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794928 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794943 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-config-data\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794956 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-sys\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794976 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-nvme\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.794996 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 
04:42:40.795013 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.795032 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw9dt\" (UniqueName: \"kubernetes.io/projected/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-kube-api-access-sw9dt\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.795049 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-var-locks-brick\") pod \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\" (UID: \"a2572d8e-c88c-44a1-bdba-27c7342dd0ac\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.795076 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-dev\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.795101 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-scripts\") pod \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\" (UID: \"3d3a8105-63e3-463c-8d7c-eaa06b97f617\") " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.796322 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-sys" (OuterVolumeSpecName: "sys") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). 
InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.796771 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-run" (OuterVolumeSpecName: "run") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.796772 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-logs" (OuterVolumeSpecName: "logs") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.796822 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798285 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-logs" (OuterVolumeSpecName: "logs") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798388 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-dev" (OuterVolumeSpecName: "dev") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798393 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798523 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798552 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-sys" (OuterVolumeSpecName: "sys") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798625 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798662 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-run" (OuterVolumeSpecName: "run") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798704 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798741 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798910 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-dev" (OuterVolumeSpecName: "dev") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798921 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798950 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798959 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.798990 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.803640 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-scripts" (OuterVolumeSpecName: "scripts") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.803774 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance-cache") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.814467 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.814560 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance-cache") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.816189 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3a8105-63e3-463c-8d7c-eaa06b97f617-kube-api-access-zckhz" (OuterVolumeSpecName: "kube-api-access-zckhz") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "kube-api-access-zckhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.817134 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.818316 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-kube-api-access-sw9dt" (OuterVolumeSpecName: "kube-api-access-sw9dt") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "kube-api-access-sw9dt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.818888 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-scripts" (OuterVolumeSpecName: "scripts") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.849435 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-config-data" (OuterVolumeSpecName: "config-data") pod "3d3a8105-63e3-463c-8d7c-eaa06b97f617" (UID: "3d3a8105-63e3-463c-8d7c-eaa06b97f617"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.854311 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-config-data" (OuterVolumeSpecName: "config-data") pod "a2572d8e-c88c-44a1-bdba-27c7342dd0ac" (UID: "a2572d8e-c88c-44a1-bdba-27c7342dd0ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897051 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897089 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw9dt\" (UniqueName: \"kubernetes.io/projected/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-kube-api-access-sw9dt\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897104 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897117 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897129 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897140 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897152 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897237 4689 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897252 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897273 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897285 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897296 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897307 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a8105-63e3-463c-8d7c-eaa06b97f617-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897318 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897330 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897341 4689 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zckhz\" (UniqueName: \"kubernetes.io/projected/3d3a8105-63e3-463c-8d7c-eaa06b97f617-kube-api-access-zckhz\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897354 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897365 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897380 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897391 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d3a8105-63e3-463c-8d7c-eaa06b97f617-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897403 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897414 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.897622 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc 
kubenswrapper[4689]: I0307 04:42:40.897652 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.898315 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.898596 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2572d8e-c88c-44a1-bdba-27c7342dd0ac-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.898628 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d3a8105-63e3-463c-8d7c-eaa06b97f617-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.898686 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.911301 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.916429 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.917268 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.920927 
4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 07 04:42:40 crc kubenswrapper[4689]: I0307 04:42:40.999920 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.000306 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.000493 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.000622 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.146448 4689 generic.go:334] "Generic (PLEG): container finished" podID="3d3a8105-63e3-463c-8d7c-eaa06b97f617" containerID="0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa" exitCode=0 Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.146578 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"3d3a8105-63e3-463c-8d7c-eaa06b97f617","Type":"ContainerDied","Data":"0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa"} Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.146623 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" 
event={"ID":"3d3a8105-63e3-463c-8d7c-eaa06b97f617","Type":"ContainerDied","Data":"29189945f43b763fa3e646de1456d32826c3b74859399e967bdaf542b488fefc"} Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.146650 4689 scope.go:117] "RemoveContainer" containerID="0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.147329 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.149482 4689 generic.go:334] "Generic (PLEG): container finished" podID="a2572d8e-c88c-44a1-bdba-27c7342dd0ac" containerID="dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f" exitCode=0 Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.149546 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"a2572d8e-c88c-44a1-bdba-27c7342dd0ac","Type":"ContainerDied","Data":"dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f"} Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.149591 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"a2572d8e-c88c-44a1-bdba-27c7342dd0ac","Type":"ContainerDied","Data":"06dd865409ed30f7bf62d205bf05b04f63c50d3da3f843e758fb82fb4eabbdf2"} Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.149644 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.186397 4689 scope.go:117] "RemoveContainer" containerID="5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.210265 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.221273 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.222284 4689 scope.go:117] "RemoveContainer" containerID="0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa" Mar 07 04:42:41 crc kubenswrapper[4689]: E0307 04:42:41.222748 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa\": container with ID starting with 0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa not found: ID does not exist" containerID="0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.222812 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa"} err="failed to get container status \"0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa\": rpc error: code = NotFound desc = could not find container \"0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa\": container with ID starting with 0b9ffed52ccebe04a5bc2dba0c1c667d8512a244e9ff9e1ca8d4fa6f4b9004aa not found: ID does not exist" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.222853 4689 scope.go:117] "RemoveContainer" 
containerID="5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a" Mar 07 04:42:41 crc kubenswrapper[4689]: E0307 04:42:41.223927 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a\": container with ID starting with 5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a not found: ID does not exist" containerID="5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.224128 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a"} err="failed to get container status \"5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a\": rpc error: code = NotFound desc = could not find container \"5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a\": container with ID starting with 5de7bc4f59a23c673528f606ad2f66777f914676ab469a83f040c0c725317e0a not found: ID does not exist" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.224324 4689 scope.go:117] "RemoveContainer" containerID="dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.238316 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.249347 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.273139 4689 scope.go:117] "RemoveContainer" containerID="8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.298378 4689 scope.go:117] "RemoveContainer" 
containerID="dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f" Mar 07 04:42:41 crc kubenswrapper[4689]: E0307 04:42:41.298848 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f\": container with ID starting with dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f not found: ID does not exist" containerID="dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.298895 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f"} err="failed to get container status \"dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f\": rpc error: code = NotFound desc = could not find container \"dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f\": container with ID starting with dcd29da6797ab4d1ff0bfcf1038d9aee27886cfa6b8e8939f9a735b91babf01f not found: ID does not exist" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.298930 4689 scope.go:117] "RemoveContainer" containerID="8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781" Mar 07 04:42:41 crc kubenswrapper[4689]: E0307 04:42:41.299266 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781\": container with ID starting with 8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781 not found: ID does not exist" containerID="8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.299294 4689 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781"} err="failed to get container status \"8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781\": rpc error: code = NotFound desc = could not find container \"8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781\": container with ID starting with 8acee4227c2e5639bf452a13af4931c15d3ec44a55c91fe3b98444e4893ce781 not found: ID does not exist" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.842532 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3a8105-63e3-463c-8d7c-eaa06b97f617" path="/var/lib/kubelet/pods/3d3a8105-63e3-463c-8d7c-eaa06b97f617/volumes" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.844083 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2572d8e-c88c-44a1-bdba-27c7342dd0ac" path="/var/lib/kubelet/pods/a2572d8e-c88c-44a1-bdba-27c7342dd0ac/volumes" Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.913399 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.913683 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="2539324b-8bd8-4820-8b6a-bc1cc184152e" containerName="glance-log" containerID="cri-o://9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6" gracePeriod=30 Mar 07 04:42:41 crc kubenswrapper[4689]: I0307 04:42:41.913804 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="2539324b-8bd8-4820-8b6a-bc1cc184152e" containerName="glance-httpd" containerID="cri-o://dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61" gracePeriod=30 Mar 07 04:42:42 crc kubenswrapper[4689]: I0307 04:42:42.158235 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="2539324b-8bd8-4820-8b6a-bc1cc184152e" containerID="9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6" exitCode=143 Mar 07 04:42:42 crc kubenswrapper[4689]: I0307 04:42:42.158299 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2539324b-8bd8-4820-8b6a-bc1cc184152e","Type":"ContainerDied","Data":"9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6"} Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.729498 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771250 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpzft\" (UniqueName: \"kubernetes.io/projected/2539324b-8bd8-4820-8b6a-bc1cc184152e-kube-api-access-jpzft\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771612 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-var-locks-brick\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771640 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-httpd-run\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771658 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-sys\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" 
(UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771685 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-lib-modules\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771705 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771724 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-logs\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771743 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-dev\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771786 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-scripts\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771809 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-nvme\") pod 
\"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771827 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771842 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-config-data\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771859 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-run\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.771876 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-iscsi\") pod \"2539324b-8bd8-4820-8b6a-bc1cc184152e\" (UID: \"2539324b-8bd8-4820-8b6a-bc1cc184152e\") " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.772147 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.772213 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.772255 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-dev" (OuterVolumeSpecName: "dev") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.772604 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.772651 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-sys" (OuterVolumeSpecName: "sys") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.772678 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.774970 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.775017 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-run" (OuterVolumeSpecName: "run") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.775549 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-logs" (OuterVolumeSpecName: "logs") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.780332 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-scripts" (OuterVolumeSpecName: "scripts") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.780359 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.781021 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2539324b-8bd8-4820-8b6a-bc1cc184152e-kube-api-access-jpzft" (OuterVolumeSpecName: "kube-api-access-jpzft") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "kube-api-access-jpzft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.782255 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.825026 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-config-data" (OuterVolumeSpecName: "config-data") pod "2539324b-8bd8-4820-8b6a-bc1cc184152e" (UID: "2539324b-8bd8-4820-8b6a-bc1cc184152e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873546 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873580 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpzft\" (UniqueName: \"kubernetes.io/projected/2539324b-8bd8-4820-8b6a-bc1cc184152e-kube-api-access-jpzft\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873596 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873608 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873620 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873631 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873662 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873675 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2539324b-8bd8-4820-8b6a-bc1cc184152e-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873687 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873697 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873708 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873725 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873738 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2539324b-8bd8-4820-8b6a-bc1cc184152e-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.873750 4689 reconciler_common.go:293] "Volume detached for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/2539324b-8bd8-4820-8b6a-bc1cc184152e-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.889390 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.890876 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.974778 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:45 crc kubenswrapper[4689]: I0307 04:42:45.974817 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:46 crc kubenswrapper[4689]: I0307 04:42:46.198551 4689 generic.go:334] "Generic (PLEG): container finished" podID="2539324b-8bd8-4820-8b6a-bc1cc184152e" containerID="dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61" exitCode=0 Mar 07 04:42:46 crc kubenswrapper[4689]: I0307 04:42:46.198614 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2539324b-8bd8-4820-8b6a-bc1cc184152e","Type":"ContainerDied","Data":"dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61"} Mar 07 04:42:46 crc kubenswrapper[4689]: I0307 04:42:46.198645 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2539324b-8bd8-4820-8b6a-bc1cc184152e","Type":"ContainerDied","Data":"0f85128b8d4460e08e9d64914818daba1fb46bd8087f1f4f960522988998b4ee"} Mar 07 04:42:46 crc 
kubenswrapper[4689]: I0307 04:42:46.198662 4689 scope.go:117] "RemoveContainer" containerID="dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61" Mar 07 04:42:46 crc kubenswrapper[4689]: I0307 04:42:46.199162 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Mar 07 04:42:46 crc kubenswrapper[4689]: I0307 04:42:46.228535 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:42:46 crc kubenswrapper[4689]: I0307 04:42:46.228879 4689 scope.go:117] "RemoveContainer" containerID="9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6" Mar 07 04:42:46 crc kubenswrapper[4689]: I0307 04:42:46.240844 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Mar 07 04:42:46 crc kubenswrapper[4689]: I0307 04:42:46.248576 4689 scope.go:117] "RemoveContainer" containerID="dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61" Mar 07 04:42:46 crc kubenswrapper[4689]: E0307 04:42:46.249224 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61\": container with ID starting with dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61 not found: ID does not exist" containerID="dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61" Mar 07 04:42:46 crc kubenswrapper[4689]: I0307 04:42:46.249358 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61"} err="failed to get container status \"dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61\": rpc error: code = NotFound desc = could not find container \"dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61\": container with ID 
starting with dfb2d11ebd980bccfbe6f860ff2c19b1e96ed4a46fda8dafff1acec9255b6e61 not found: ID does not exist" Mar 07 04:42:46 crc kubenswrapper[4689]: I0307 04:42:46.249400 4689 scope.go:117] "RemoveContainer" containerID="9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6" Mar 07 04:42:46 crc kubenswrapper[4689]: E0307 04:42:46.249838 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6\": container with ID starting with 9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6 not found: ID does not exist" containerID="9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6" Mar 07 04:42:46 crc kubenswrapper[4689]: I0307 04:42:46.249875 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6"} err="failed to get container status \"9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6\": rpc error: code = NotFound desc = could not find container \"9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6\": container with ID starting with 9514b39675dab348e9df4ad54366eb11be72a26dbb6b5371e7cf66aff5930ff6 not found: ID does not exist" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.349915 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-s7wqk"] Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.362075 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-s7wqk"] Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.374921 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance3153-account-delete-m764h"] Mar 07 04:42:47 crc kubenswrapper[4689]: E0307 04:42:47.375251 4689 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a2572d8e-c88c-44a1-bdba-27c7342dd0ac" containerName="glance-httpd" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375270 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2572d8e-c88c-44a1-bdba-27c7342dd0ac" containerName="glance-httpd" Mar 07 04:42:47 crc kubenswrapper[4689]: E0307 04:42:47.375290 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3a8105-63e3-463c-8d7c-eaa06b97f617" containerName="glance-log" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375299 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3a8105-63e3-463c-8d7c-eaa06b97f617" containerName="glance-log" Mar 07 04:42:47 crc kubenswrapper[4689]: E0307 04:42:47.375317 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2572d8e-c88c-44a1-bdba-27c7342dd0ac" containerName="glance-log" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375325 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2572d8e-c88c-44a1-bdba-27c7342dd0ac" containerName="glance-log" Mar 07 04:42:47 crc kubenswrapper[4689]: E0307 04:42:47.375340 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3a8105-63e3-463c-8d7c-eaa06b97f617" containerName="glance-httpd" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375348 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3a8105-63e3-463c-8d7c-eaa06b97f617" containerName="glance-httpd" Mar 07 04:42:47 crc kubenswrapper[4689]: E0307 04:42:47.375359 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2539324b-8bd8-4820-8b6a-bc1cc184152e" containerName="glance-httpd" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375367 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="2539324b-8bd8-4820-8b6a-bc1cc184152e" containerName="glance-httpd" Mar 07 04:42:47 crc kubenswrapper[4689]: E0307 04:42:47.375382 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2539324b-8bd8-4820-8b6a-bc1cc184152e" 
containerName="glance-log" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375390 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="2539324b-8bd8-4820-8b6a-bc1cc184152e" containerName="glance-log" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375559 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="2539324b-8bd8-4820-8b6a-bc1cc184152e" containerName="glance-log" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375580 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3a8105-63e3-463c-8d7c-eaa06b97f617" containerName="glance-log" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375589 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2572d8e-c88c-44a1-bdba-27c7342dd0ac" containerName="glance-httpd" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375603 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2572d8e-c88c-44a1-bdba-27c7342dd0ac" containerName="glance-log" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375612 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3a8105-63e3-463c-8d7c-eaa06b97f617" containerName="glance-httpd" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.375627 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="2539324b-8bd8-4820-8b6a-bc1cc184152e" containerName="glance-httpd" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.376134 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3153-account-delete-m764h" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.384662 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3153-account-delete-m764h"] Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.493937 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fm5h\" (UniqueName: \"kubernetes.io/projected/a599f088-e3d4-4307-82bb-15f128953741-kube-api-access-9fm5h\") pod \"glance3153-account-delete-m764h\" (UID: \"a599f088-e3d4-4307-82bb-15f128953741\") " pod="glance-kuttl-tests/glance3153-account-delete-m764h" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.493995 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a599f088-e3d4-4307-82bb-15f128953741-operator-scripts\") pod \"glance3153-account-delete-m764h\" (UID: \"a599f088-e3d4-4307-82bb-15f128953741\") " pod="glance-kuttl-tests/glance3153-account-delete-m764h" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.595292 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fm5h\" (UniqueName: \"kubernetes.io/projected/a599f088-e3d4-4307-82bb-15f128953741-kube-api-access-9fm5h\") pod \"glance3153-account-delete-m764h\" (UID: \"a599f088-e3d4-4307-82bb-15f128953741\") " pod="glance-kuttl-tests/glance3153-account-delete-m764h" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.595378 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a599f088-e3d4-4307-82bb-15f128953741-operator-scripts\") pod \"glance3153-account-delete-m764h\" (UID: \"a599f088-e3d4-4307-82bb-15f128953741\") " pod="glance-kuttl-tests/glance3153-account-delete-m764h" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 
04:42:47.596868 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a599f088-e3d4-4307-82bb-15f128953741-operator-scripts\") pod \"glance3153-account-delete-m764h\" (UID: \"a599f088-e3d4-4307-82bb-15f128953741\") " pod="glance-kuttl-tests/glance3153-account-delete-m764h" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.617336 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fm5h\" (UniqueName: \"kubernetes.io/projected/a599f088-e3d4-4307-82bb-15f128953741-kube-api-access-9fm5h\") pod \"glance3153-account-delete-m764h\" (UID: \"a599f088-e3d4-4307-82bb-15f128953741\") " pod="glance-kuttl-tests/glance3153-account-delete-m764h" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.696411 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3153-account-delete-m764h" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.847290 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2539324b-8bd8-4820-8b6a-bc1cc184152e" path="/var/lib/kubelet/pods/2539324b-8bd8-4820-8b6a-bc1cc184152e/volumes" Mar 07 04:42:47 crc kubenswrapper[4689]: I0307 04:42:47.847942 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d31bed1-9172-4a7a-a066-8d80352d60d5" path="/var/lib/kubelet/pods/7d31bed1-9172-4a7a-a066-8d80352d60d5/volumes" Mar 07 04:42:48 crc kubenswrapper[4689]: I0307 04:42:48.254261 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3153-account-delete-m764h"] Mar 07 04:42:49 crc kubenswrapper[4689]: I0307 04:42:49.225602 4689 generic.go:334] "Generic (PLEG): container finished" podID="a599f088-e3d4-4307-82bb-15f128953741" containerID="2b0bc0ffbe7e65000809212717bbdfd5cff4a845cbd284041e59b727bbd42d89" exitCode=0 Mar 07 04:42:49 crc kubenswrapper[4689]: I0307 04:42:49.225662 4689 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="glance-kuttl-tests/glance3153-account-delete-m764h" event={"ID":"a599f088-e3d4-4307-82bb-15f128953741","Type":"ContainerDied","Data":"2b0bc0ffbe7e65000809212717bbdfd5cff4a845cbd284041e59b727bbd42d89"} Mar 07 04:42:49 crc kubenswrapper[4689]: I0307 04:42:49.225876 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3153-account-delete-m764h" event={"ID":"a599f088-e3d4-4307-82bb-15f128953741","Type":"ContainerStarted","Data":"24c00d7cae2e6199d11cffe98d1b798798cc947e0c355745f7af2352b67afd58"} Mar 07 04:42:50 crc kubenswrapper[4689]: I0307 04:42:50.586984 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3153-account-delete-m764h" Mar 07 04:42:50 crc kubenswrapper[4689]: I0307 04:42:50.637657 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fm5h\" (UniqueName: \"kubernetes.io/projected/a599f088-e3d4-4307-82bb-15f128953741-kube-api-access-9fm5h\") pod \"a599f088-e3d4-4307-82bb-15f128953741\" (UID: \"a599f088-e3d4-4307-82bb-15f128953741\") " Mar 07 04:42:50 crc kubenswrapper[4689]: I0307 04:42:50.637920 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a599f088-e3d4-4307-82bb-15f128953741-operator-scripts\") pod \"a599f088-e3d4-4307-82bb-15f128953741\" (UID: \"a599f088-e3d4-4307-82bb-15f128953741\") " Mar 07 04:42:50 crc kubenswrapper[4689]: I0307 04:42:50.638591 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a599f088-e3d4-4307-82bb-15f128953741-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a599f088-e3d4-4307-82bb-15f128953741" (UID: "a599f088-e3d4-4307-82bb-15f128953741"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:42:50 crc kubenswrapper[4689]: I0307 04:42:50.645729 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a599f088-e3d4-4307-82bb-15f128953741-kube-api-access-9fm5h" (OuterVolumeSpecName: "kube-api-access-9fm5h") pod "a599f088-e3d4-4307-82bb-15f128953741" (UID: "a599f088-e3d4-4307-82bb-15f128953741"). InnerVolumeSpecName "kube-api-access-9fm5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:42:50 crc kubenswrapper[4689]: I0307 04:42:50.739119 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a599f088-e3d4-4307-82bb-15f128953741-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:50 crc kubenswrapper[4689]: I0307 04:42:50.739161 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fm5h\" (UniqueName: \"kubernetes.io/projected/a599f088-e3d4-4307-82bb-15f128953741-kube-api-access-9fm5h\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.041071 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Mar 07 04:42:51 crc kubenswrapper[4689]: E0307 04:42:51.041371 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a599f088-e3d4-4307-82bb-15f128953741" containerName="mariadb-account-delete" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.041387 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="a599f088-e3d4-4307-82bb-15f128953741" containerName="mariadb-account-delete" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.041556 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="a599f088-e3d4-4307-82bb-15f128953741" containerName="mariadb-account-delete" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.042075 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.046636 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.046710 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-q4kfd" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.046827 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.046973 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-6k9k4c8bfg" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.054833 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.143379 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhgcx\" (UniqueName: \"kubernetes.io/projected/3c25a937-0d93-4077-92d7-fbeac4f6abb3-kube-api-access-zhgcx\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.143456 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-scripts\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.143485 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.143541 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.244483 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-scripts\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.244554 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.244669 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.244737 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhgcx\" (UniqueName: 
\"kubernetes.io/projected/3c25a937-0d93-4077-92d7-fbeac4f6abb3-kube-api-access-zhgcx\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.245731 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-scripts\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.246317 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.246471 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3153-account-delete-m764h" event={"ID":"a599f088-e3d4-4307-82bb-15f128953741","Type":"ContainerDied","Data":"24c00d7cae2e6199d11cffe98d1b798798cc947e0c355745f7af2352b67afd58"} Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.246528 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24c00d7cae2e6199d11cffe98d1b798798cc947e0c355745f7af2352b67afd58" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.246564 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3153-account-delete-m764h" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.249413 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.270000 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhgcx\" (UniqueName: \"kubernetes.io/projected/3c25a937-0d93-4077-92d7-fbeac4f6abb3-kube-api-access-zhgcx\") pod \"openstackclient\" (UID: \"3c25a937-0d93-4077-92d7-fbeac4f6abb3\") " pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.378946 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.540750 4689 scope.go:117] "RemoveContainer" containerID="3f4f0fed307167477eefb030f2243db52d2ea03369ae3e197f195f92174acb2f" Mar 07 04:42:51 crc kubenswrapper[4689]: I0307 04:42:51.810095 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Mar 07 04:42:51 crc kubenswrapper[4689]: W0307 04:42:51.824057 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c25a937_0d93_4077_92d7_fbeac4f6abb3.slice/crio-47c643334f34f60be96d9ff6de82ba4e04684da0affbd208857af5a8b41e93e9 WatchSource:0}: Error finding container 47c643334f34f60be96d9ff6de82ba4e04684da0affbd208857af5a8b41e93e9: Status 404 returned error can't find the container with id 47c643334f34f60be96d9ff6de82ba4e04684da0affbd208857af5a8b41e93e9 Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.263213 4689 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"3c25a937-0d93-4077-92d7-fbeac4f6abb3","Type":"ContainerStarted","Data":"421147f82a51e07f13b6d990a56f8edcdf79a653a3a3b17e40930a71ee970934"} Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.263293 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"3c25a937-0d93-4077-92d7-fbeac4f6abb3","Type":"ContainerStarted","Data":"47c643334f34f60be96d9ff6de82ba4e04684da0affbd208857af5a8b41e93e9"} Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.289403 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=1.28938147 podStartE2EDuration="1.28938147s" podCreationTimestamp="2026-03-07 04:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:42:52.28641941 +0000 UTC m=+1417.332802959" watchObservedRunningTime="2026-03-07 04:42:52.28938147 +0000 UTC m=+1417.335764969" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.412523 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-s6mwz"] Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.418969 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-s6mwz"] Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.443692 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance3153-account-delete-m764h"] Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.452423 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance3153-account-delete-m764h"] Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.462633 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-3153-account-create-update-p4bp6"] Mar 07 04:42:52 crc kubenswrapper[4689]: 
I0307 04:42:52.472128 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-3153-account-create-update-p4bp6"] Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.612032 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-65kgs"] Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.613877 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-65kgs" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.663770 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-65kgs"] Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.667832 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtmnl\" (UniqueName: \"kubernetes.io/projected/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-kube-api-access-vtmnl\") pod \"glance-db-create-65kgs\" (UID: \"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd\") " pod="glance-kuttl-tests/glance-db-create-65kgs" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.667969 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-operator-scripts\") pod \"glance-db-create-65kgs\" (UID: \"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd\") " pod="glance-kuttl-tests/glance-db-create-65kgs" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.672819 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-1dad-account-create-update-b4266"] Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.673947 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.678994 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.694826 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-1dad-account-create-update-b4266"] Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.769190 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-operator-scripts\") pod \"glance-db-create-65kgs\" (UID: \"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd\") " pod="glance-kuttl-tests/glance-db-create-65kgs" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.769513 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtmnl\" (UniqueName: \"kubernetes.io/projected/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-kube-api-access-vtmnl\") pod \"glance-db-create-65kgs\" (UID: \"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd\") " pod="glance-kuttl-tests/glance-db-create-65kgs" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.769787 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-operator-scripts\") pod \"glance-db-create-65kgs\" (UID: \"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd\") " pod="glance-kuttl-tests/glance-db-create-65kgs" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.786203 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtmnl\" (UniqueName: \"kubernetes.io/projected/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-kube-api-access-vtmnl\") pod \"glance-db-create-65kgs\" (UID: \"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd\") " 
pod="glance-kuttl-tests/glance-db-create-65kgs" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.871299 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2d73e07-546c-4f2f-8802-ba074301609e-operator-scripts\") pod \"glance-1dad-account-create-update-b4266\" (UID: \"a2d73e07-546c-4f2f-8802-ba074301609e\") " pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.871430 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bfcg\" (UniqueName: \"kubernetes.io/projected/a2d73e07-546c-4f2f-8802-ba074301609e-kube-api-access-4bfcg\") pod \"glance-1dad-account-create-update-b4266\" (UID: \"a2d73e07-546c-4f2f-8802-ba074301609e\") " pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.972161 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bfcg\" (UniqueName: \"kubernetes.io/projected/a2d73e07-546c-4f2f-8802-ba074301609e-kube-api-access-4bfcg\") pod \"glance-1dad-account-create-update-b4266\" (UID: \"a2d73e07-546c-4f2f-8802-ba074301609e\") " pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.972484 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2d73e07-546c-4f2f-8802-ba074301609e-operator-scripts\") pod \"glance-1dad-account-create-update-b4266\" (UID: \"a2d73e07-546c-4f2f-8802-ba074301609e\") " pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.973933 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a2d73e07-546c-4f2f-8802-ba074301609e-operator-scripts\") pod \"glance-1dad-account-create-update-b4266\" (UID: \"a2d73e07-546c-4f2f-8802-ba074301609e\") " pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.985923 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-65kgs" Mar 07 04:42:52 crc kubenswrapper[4689]: I0307 04:42:52.993159 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bfcg\" (UniqueName: \"kubernetes.io/projected/a2d73e07-546c-4f2f-8802-ba074301609e-kube-api-access-4bfcg\") pod \"glance-1dad-account-create-update-b4266\" (UID: \"a2d73e07-546c-4f2f-8802-ba074301609e\") " pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" Mar 07 04:42:53 crc kubenswrapper[4689]: I0307 04:42:53.003153 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" Mar 07 04:42:53 crc kubenswrapper[4689]: I0307 04:42:53.306898 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-1dad-account-create-update-b4266"] Mar 07 04:42:53 crc kubenswrapper[4689]: I0307 04:42:53.425036 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-65kgs"] Mar 07 04:42:53 crc kubenswrapper[4689]: W0307 04:42:53.427909 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11f632ac_bf0e_40d0_bcf8_4d5ed1893ccd.slice/crio-57be728eca45efd5cbd96ff12fb23588193b63ff430e477c694b9436a49b1229 WatchSource:0}: Error finding container 57be728eca45efd5cbd96ff12fb23588193b63ff430e477c694b9436a49b1229: Status 404 returned error can't find the container with id 57be728eca45efd5cbd96ff12fb23588193b63ff430e477c694b9436a49b1229 Mar 07 04:42:53 crc kubenswrapper[4689]: 
I0307 04:42:53.835579 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a599f088-e3d4-4307-82bb-15f128953741" path="/var/lib/kubelet/pods/a599f088-e3d4-4307-82bb-15f128953741/volumes" Mar 07 04:42:53 crc kubenswrapper[4689]: I0307 04:42:53.836121 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="badeb223-8f16-446f-8b09-ee99b5fc8ee7" path="/var/lib/kubelet/pods/badeb223-8f16-446f-8b09-ee99b5fc8ee7/volumes" Mar 07 04:42:53 crc kubenswrapper[4689]: I0307 04:42:53.836629 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2e0245c-2824-46a4-8c46-e5dabccff5e5" path="/var/lib/kubelet/pods/d2e0245c-2824-46a4-8c46-e5dabccff5e5/volumes" Mar 07 04:42:54 crc kubenswrapper[4689]: I0307 04:42:54.286226 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" event={"ID":"a2d73e07-546c-4f2f-8802-ba074301609e","Type":"ContainerStarted","Data":"fae9eaad49e77e33096b5f19d267d45f257700c94bf685adb23a624bdff28d38"} Mar 07 04:42:54 crc kubenswrapper[4689]: I0307 04:42:54.286268 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" event={"ID":"a2d73e07-546c-4f2f-8802-ba074301609e","Type":"ContainerStarted","Data":"c982e9f8b4abb13b92dde35802d12e98f0248c4af8634b8a1f4aa46615b9ec84"} Mar 07 04:42:54 crc kubenswrapper[4689]: I0307 04:42:54.288018 4689 generic.go:334] "Generic (PLEG): container finished" podID="11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd" containerID="55ba98ad14b1559c0c895a472e1d83598f8fa20fb19f81b4c7f62b161471175f" exitCode=0 Mar 07 04:42:54 crc kubenswrapper[4689]: I0307 04:42:54.288077 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-65kgs" event={"ID":"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd","Type":"ContainerDied","Data":"55ba98ad14b1559c0c895a472e1d83598f8fa20fb19f81b4c7f62b161471175f"} Mar 07 04:42:54 crc kubenswrapper[4689]: I0307 
04:42:54.288111 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-65kgs" event={"ID":"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd","Type":"ContainerStarted","Data":"57be728eca45efd5cbd96ff12fb23588193b63ff430e477c694b9436a49b1229"} Mar 07 04:42:54 crc kubenswrapper[4689]: I0307 04:42:54.308927 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" podStartSLOduration=2.308905478 podStartE2EDuration="2.308905478s" podCreationTimestamp="2026-03-07 04:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:42:54.307791928 +0000 UTC m=+1419.354175467" watchObservedRunningTime="2026-03-07 04:42:54.308905478 +0000 UTC m=+1419.355288977" Mar 07 04:42:55 crc kubenswrapper[4689]: I0307 04:42:55.299265 4689 generic.go:334] "Generic (PLEG): container finished" podID="a2d73e07-546c-4f2f-8802-ba074301609e" containerID="fae9eaad49e77e33096b5f19d267d45f257700c94bf685adb23a624bdff28d38" exitCode=0 Mar 07 04:42:55 crc kubenswrapper[4689]: I0307 04:42:55.299408 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" event={"ID":"a2d73e07-546c-4f2f-8802-ba074301609e","Type":"ContainerDied","Data":"fae9eaad49e77e33096b5f19d267d45f257700c94bf685adb23a624bdff28d38"} Mar 07 04:42:55 crc kubenswrapper[4689]: I0307 04:42:55.605707 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-65kgs" Mar 07 04:42:55 crc kubenswrapper[4689]: I0307 04:42:55.616335 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-operator-scripts\") pod \"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd\" (UID: \"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd\") " Mar 07 04:42:55 crc kubenswrapper[4689]: I0307 04:42:55.616390 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtmnl\" (UniqueName: \"kubernetes.io/projected/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-kube-api-access-vtmnl\") pod \"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd\" (UID: \"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd\") " Mar 07 04:42:55 crc kubenswrapper[4689]: I0307 04:42:55.617160 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd" (UID: "11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:42:55 crc kubenswrapper[4689]: I0307 04:42:55.624446 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-kube-api-access-vtmnl" (OuterVolumeSpecName: "kube-api-access-vtmnl") pod "11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd" (UID: "11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd"). InnerVolumeSpecName "kube-api-access-vtmnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:42:55 crc kubenswrapper[4689]: I0307 04:42:55.718194 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:55 crc kubenswrapper[4689]: I0307 04:42:55.718497 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtmnl\" (UniqueName: \"kubernetes.io/projected/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd-kube-api-access-vtmnl\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:56 crc kubenswrapper[4689]: I0307 04:42:56.310842 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-65kgs" event={"ID":"11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd","Type":"ContainerDied","Data":"57be728eca45efd5cbd96ff12fb23588193b63ff430e477c694b9436a49b1229"} Mar 07 04:42:56 crc kubenswrapper[4689]: I0307 04:42:56.310870 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-65kgs" Mar 07 04:42:56 crc kubenswrapper[4689]: I0307 04:42:56.310886 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57be728eca45efd5cbd96ff12fb23588193b63ff430e477c694b9436a49b1229" Mar 07 04:42:56 crc kubenswrapper[4689]: I0307 04:42:56.595938 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" Mar 07 04:42:56 crc kubenswrapper[4689]: I0307 04:42:56.632742 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bfcg\" (UniqueName: \"kubernetes.io/projected/a2d73e07-546c-4f2f-8802-ba074301609e-kube-api-access-4bfcg\") pod \"a2d73e07-546c-4f2f-8802-ba074301609e\" (UID: \"a2d73e07-546c-4f2f-8802-ba074301609e\") " Mar 07 04:42:56 crc kubenswrapper[4689]: I0307 04:42:56.632965 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2d73e07-546c-4f2f-8802-ba074301609e-operator-scripts\") pod \"a2d73e07-546c-4f2f-8802-ba074301609e\" (UID: \"a2d73e07-546c-4f2f-8802-ba074301609e\") " Mar 07 04:42:56 crc kubenswrapper[4689]: I0307 04:42:56.634036 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d73e07-546c-4f2f-8802-ba074301609e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2d73e07-546c-4f2f-8802-ba074301609e" (UID: "a2d73e07-546c-4f2f-8802-ba074301609e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:42:56 crc kubenswrapper[4689]: I0307 04:42:56.638941 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d73e07-546c-4f2f-8802-ba074301609e-kube-api-access-4bfcg" (OuterVolumeSpecName: "kube-api-access-4bfcg") pod "a2d73e07-546c-4f2f-8802-ba074301609e" (UID: "a2d73e07-546c-4f2f-8802-ba074301609e"). InnerVolumeSpecName "kube-api-access-4bfcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:42:56 crc kubenswrapper[4689]: I0307 04:42:56.734817 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2d73e07-546c-4f2f-8802-ba074301609e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:56 crc kubenswrapper[4689]: I0307 04:42:56.734870 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bfcg\" (UniqueName: \"kubernetes.io/projected/a2d73e07-546c-4f2f-8802-ba074301609e-kube-api-access-4bfcg\") on node \"crc\" DevicePath \"\"" Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.322716 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" event={"ID":"a2d73e07-546c-4f2f-8802-ba074301609e","Type":"ContainerDied","Data":"c982e9f8b4abb13b92dde35802d12e98f0248c4af8634b8a1f4aa46615b9ec84"} Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.322753 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c982e9f8b4abb13b92dde35802d12e98f0248c4af8634b8a1f4aa46615b9ec84" Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.322765 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-1dad-account-create-update-b4266" Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.863997 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-sjcmd"] Mar 07 04:42:57 crc kubenswrapper[4689]: E0307 04:42:57.864849 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd" containerName="mariadb-database-create" Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.864874 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd" containerName="mariadb-database-create" Mar 07 04:42:57 crc kubenswrapper[4689]: E0307 04:42:57.864917 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d73e07-546c-4f2f-8802-ba074301609e" containerName="mariadb-account-create-update" Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.864930 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d73e07-546c-4f2f-8802-ba074301609e" containerName="mariadb-account-create-update" Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.865195 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d73e07-546c-4f2f-8802-ba074301609e" containerName="mariadb-account-create-update" Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.865219 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd" containerName="mariadb-database-create" Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.866009 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.867841 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.868928 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-ldjhh" Mar 07 04:42:57 crc kubenswrapper[4689]: I0307 04:42:57.877502 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-sjcmd"] Mar 07 04:42:58 crc kubenswrapper[4689]: I0307 04:42:58.055530 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9g7s\" (UniqueName: \"kubernetes.io/projected/e6740dbc-96ca-41c6-865d-c2c2e195c954-kube-api-access-f9g7s\") pod \"glance-db-sync-sjcmd\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:42:58 crc kubenswrapper[4689]: I0307 04:42:58.055630 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-db-sync-config-data\") pod \"glance-db-sync-sjcmd\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:42:58 crc kubenswrapper[4689]: I0307 04:42:58.055699 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-config-data\") pod \"glance-db-sync-sjcmd\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:42:58 crc kubenswrapper[4689]: I0307 04:42:58.157533 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9g7s\" (UniqueName: 
\"kubernetes.io/projected/e6740dbc-96ca-41c6-865d-c2c2e195c954-kube-api-access-f9g7s\") pod \"glance-db-sync-sjcmd\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:42:58 crc kubenswrapper[4689]: I0307 04:42:58.157628 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-db-sync-config-data\") pod \"glance-db-sync-sjcmd\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:42:58 crc kubenswrapper[4689]: I0307 04:42:58.157688 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-config-data\") pod \"glance-db-sync-sjcmd\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:42:58 crc kubenswrapper[4689]: I0307 04:42:58.162313 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-config-data\") pod \"glance-db-sync-sjcmd\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:42:58 crc kubenswrapper[4689]: I0307 04:42:58.163116 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-db-sync-config-data\") pod \"glance-db-sync-sjcmd\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:42:58 crc kubenswrapper[4689]: I0307 04:42:58.176555 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9g7s\" (UniqueName: \"kubernetes.io/projected/e6740dbc-96ca-41c6-865d-c2c2e195c954-kube-api-access-f9g7s\") pod 
\"glance-db-sync-sjcmd\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:42:58 crc kubenswrapper[4689]: I0307 04:42:58.186577 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:42:58 crc kubenswrapper[4689]: I0307 04:42:58.653060 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-sjcmd"] Mar 07 04:42:59 crc kubenswrapper[4689]: I0307 04:42:59.341616 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-sjcmd" event={"ID":"e6740dbc-96ca-41c6-865d-c2c2e195c954","Type":"ContainerStarted","Data":"485bf29ff602d11194b64881629d8f1a13fac4096de96657cfdfffc1a68505ce"} Mar 07 04:42:59 crc kubenswrapper[4689]: I0307 04:42:59.341965 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-sjcmd" event={"ID":"e6740dbc-96ca-41c6-865d-c2c2e195c954","Type":"ContainerStarted","Data":"0feee058c5d3e5b6eae022bf5de70654a0c71d3b5daf44e25e4d3a50cc6c36e3"} Mar 07 04:43:00 crc kubenswrapper[4689]: I0307 04:43:00.379225 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-sjcmd" podStartSLOduration=3.379199868 podStartE2EDuration="3.379199868s" podCreationTimestamp="2026-03-07 04:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:43:00.37778754 +0000 UTC m=+1425.424171039" watchObservedRunningTime="2026-03-07 04:43:00.379199868 +0000 UTC m=+1425.425583387" Mar 07 04:43:02 crc kubenswrapper[4689]: I0307 04:43:02.383796 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6740dbc-96ca-41c6-865d-c2c2e195c954" containerID="485bf29ff602d11194b64881629d8f1a13fac4096de96657cfdfffc1a68505ce" exitCode=0 Mar 07 04:43:02 crc kubenswrapper[4689]: I0307 04:43:02.384092 4689 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-sjcmd" event={"ID":"e6740dbc-96ca-41c6-865d-c2c2e195c954","Type":"ContainerDied","Data":"485bf29ff602d11194b64881629d8f1a13fac4096de96657cfdfffc1a68505ce"} Mar 07 04:43:03 crc kubenswrapper[4689]: I0307 04:43:03.725085 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:43:03 crc kubenswrapper[4689]: I0307 04:43:03.740142 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-db-sync-config-data\") pod \"e6740dbc-96ca-41c6-865d-c2c2e195c954\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " Mar 07 04:43:03 crc kubenswrapper[4689]: I0307 04:43:03.740245 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-config-data\") pod \"e6740dbc-96ca-41c6-865d-c2c2e195c954\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " Mar 07 04:43:03 crc kubenswrapper[4689]: I0307 04:43:03.740308 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9g7s\" (UniqueName: \"kubernetes.io/projected/e6740dbc-96ca-41c6-865d-c2c2e195c954-kube-api-access-f9g7s\") pod \"e6740dbc-96ca-41c6-865d-c2c2e195c954\" (UID: \"e6740dbc-96ca-41c6-865d-c2c2e195c954\") " Mar 07 04:43:03 crc kubenswrapper[4689]: I0307 04:43:03.746753 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6740dbc-96ca-41c6-865d-c2c2e195c954-kube-api-access-f9g7s" (OuterVolumeSpecName: "kube-api-access-f9g7s") pod "e6740dbc-96ca-41c6-865d-c2c2e195c954" (UID: "e6740dbc-96ca-41c6-865d-c2c2e195c954"). InnerVolumeSpecName "kube-api-access-f9g7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:43:03 crc kubenswrapper[4689]: I0307 04:43:03.746821 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e6740dbc-96ca-41c6-865d-c2c2e195c954" (UID: "e6740dbc-96ca-41c6-865d-c2c2e195c954"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:43:03 crc kubenswrapper[4689]: I0307 04:43:03.790766 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-config-data" (OuterVolumeSpecName: "config-data") pod "e6740dbc-96ca-41c6-865d-c2c2e195c954" (UID: "e6740dbc-96ca-41c6-865d-c2c2e195c954"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:43:03 crc kubenswrapper[4689]: I0307 04:43:03.842271 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:03 crc kubenswrapper[4689]: I0307 04:43:03.842488 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9g7s\" (UniqueName: \"kubernetes.io/projected/e6740dbc-96ca-41c6-865d-c2c2e195c954-kube-api-access-f9g7s\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:03 crc kubenswrapper[4689]: I0307 04:43:03.842549 4689 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6740dbc-96ca-41c6-865d-c2c2e195c954-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:04 crc kubenswrapper[4689]: I0307 04:43:04.399771 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-sjcmd" 
event={"ID":"e6740dbc-96ca-41c6-865d-c2c2e195c954","Type":"ContainerDied","Data":"0feee058c5d3e5b6eae022bf5de70654a0c71d3b5daf44e25e4d3a50cc6c36e3"} Mar 07 04:43:04 crc kubenswrapper[4689]: I0307 04:43:04.400081 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0feee058c5d3e5b6eae022bf5de70654a0c71d3b5daf44e25e4d3a50cc6c36e3" Mar 07 04:43:04 crc kubenswrapper[4689]: I0307 04:43:04.399835 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-sjcmd" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.519803 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:43:05 crc kubenswrapper[4689]: E0307 04:43:05.520397 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6740dbc-96ca-41c6-865d-c2c2e195c954" containerName="glance-db-sync" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.520413 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6740dbc-96ca-41c6-865d-c2c2e195c954" containerName="glance-db-sync" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.520582 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6740dbc-96ca-41c6-865d-c2c2e195c954" containerName="glance-db-sync" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.521461 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.524159 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-ldjhh" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.526026 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.529918 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.546836 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680336 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-config-data\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680390 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt8dh\" (UniqueName: \"kubernetes.io/projected/7431fc02-96c1-4a55-aad6-83c23610f7a0-kube-api-access-mt8dh\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680454 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-dev\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680478 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-run\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680524 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680610 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-logs\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680656 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680749 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: 
\"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680788 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-sys\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680865 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680900 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.680922 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.681082 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-scripts\") pod 
\"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.681140 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.728679 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.729773 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.758019 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.782906 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-logs\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.782961 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783030 4689 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783056 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-sys\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783094 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783118 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783138 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783186 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783207 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-scripts\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783215 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783232 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-config-data\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783430 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-sys\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783466 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-lib-modules\") pod 
\"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783486 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt8dh\" (UniqueName: \"kubernetes.io/projected/7431fc02-96c1-4a55-aad6-83c23610f7a0-kube-api-access-mt8dh\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783549 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783573 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-dev\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783589 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783605 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-run\") pod 
\"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783625 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-logs\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783629 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783663 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-run\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783698 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-dev\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783735 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783813 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.783836 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.806734 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-scripts\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.814247 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.815457 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-config-data\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 
04:43:05.815555 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.815651 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt8dh\" (UniqueName: \"kubernetes.io/projected/7431fc02-96c1-4a55-aad6-83c23610f7a0-kube-api-access-mt8dh\") pod \"glance-default-external-api-1\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.841271 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.872955 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.875850 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.877487 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.877660 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.878461 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.880761 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.883877 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884675 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfp4v\" (UniqueName: \"kubernetes.io/projected/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-kube-api-access-jfp4v\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884706 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884746 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-sys\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884768 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: 
\"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884783 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-logs\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884812 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884839 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884869 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884883 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884900 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-dev\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884959 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.884983 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-run\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.885005 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.885025 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.989621 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-sys\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990054 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.989768 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-sys\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990154 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990291 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-logs\") pod 
\"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990315 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990337 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990376 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990406 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990421 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-logs\") 
pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990436 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-dev\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990449 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-scripts\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990465 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990483 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-sys\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990502 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990525 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990539 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990561 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990591 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-run\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990613 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-dev\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990628 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990646 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990660 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990687 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-dev\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990703 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44fmm\" (UniqueName: 
\"kubernetes.io/projected/eea324bb-0c4d-4636-a821-07077ec1c6cf-kube-api-access-44fmm\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990727 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990742 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-sys\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990774 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990793 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990816 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990841 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-run\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990855 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz27s\" (UniqueName: \"kubernetes.io/projected/a34f4288-15d9-4ffc-8ff3-1a9f05001339-kube-api-access-lz27s\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990880 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-config-data\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.990877 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-logs\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.991719 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-run\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.991729 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.991751 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-dev\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.991774 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.991819 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.991878 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.991901 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.991935 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.991943 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.991975 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.992089 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.992494 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.992659 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfp4v\" (UniqueName: \"kubernetes.io/projected/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-kube-api-access-jfp4v\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.992688 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.992731 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-run\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.992789 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.992815 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.992835 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.993236 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:05 crc kubenswrapper[4689]: I0307 04:43:05.995768 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.017491 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.018843 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfp4v\" (UniqueName: \"kubernetes.io/projected/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-kube-api-access-jfp4v\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.024655 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.036487 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-0\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.046101 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.096933 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.096997 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-sys\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097022 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097057 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097083 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz27s\" (UniqueName: \"kubernetes.io/projected/a34f4288-15d9-4ffc-8ff3-1a9f05001339-kube-api-access-lz27s\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097101 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-config-data\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097120 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097140 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097180 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-run\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097201 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc 
kubenswrapper[4689]: I0307 04:43:06.097217 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097232 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097252 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097272 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097288 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097310 
4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097329 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-logs\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097345 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-dev\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097361 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-scripts\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097375 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097390 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-sys\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097408 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097426 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097441 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097462 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-run\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097479 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097494 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-dev\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097515 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44fmm\" (UniqueName: \"kubernetes.io/projected/eea324bb-0c4d-4636-a821-07077ec1c6cf-kube-api-access-44fmm\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.097988 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.099270 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.099331 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-sys\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.099382 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.102866 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-sys\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.102929 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.102950 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.103274 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-logs\") pod 
\"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.103307 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-dev\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.103646 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-run\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.104821 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.104917 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.105728 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.105978 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.106012 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-run\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.106047 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.106282 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.106316 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: 
I0307 04:43:06.106374 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.106491 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.106559 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.106666 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-dev\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.112814 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.113331 
4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-scripts\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.123927 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.126665 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-config-data\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.133188 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz27s\" (UniqueName: \"kubernetes.io/projected/a34f4288-15d9-4ffc-8ff3-1a9f05001339-kube-api-access-lz27s\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.148497 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44fmm\" (UniqueName: \"kubernetes.io/projected/eea324bb-0c4d-4636-a821-07077ec1c6cf-kube-api-access-44fmm\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.169665 4689 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.231816 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.233563 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-1\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.237813 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.243073 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.322089 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.431262 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"7431fc02-96c1-4a55-aad6-83c23610f7a0","Type":"ContainerStarted","Data":"e34652d4c3bab94f17b2708c2978a717754804203f62e845d54ae15d85fac95b"} Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.535915 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.577753 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.688583 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:43:06 crc kubenswrapper[4689]: I0307 04:43:06.720244 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:43:06 crc kubenswrapper[4689]: W0307 04:43:06.729497 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda34f4288_15d9_4ffc_8ff3_1a9f05001339.slice/crio-d7a32e76c442025d6158a867badffa6c1cb5153910fec12674050878bf288027 WatchSource:0}: Error finding container d7a32e76c442025d6158a867badffa6c1cb5153910fec12674050878bf288027: Status 404 returned error can't find the container with id d7a32e76c442025d6158a867badffa6c1cb5153910fec12674050878bf288027 Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.075604 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.440420 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"a34f4288-15d9-4ffc-8ff3-1a9f05001339","Type":"ContainerStarted","Data":"14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107"} Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.441125 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"a34f4288-15d9-4ffc-8ff3-1a9f05001339","Type":"ContainerStarted","Data":"fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed"} Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.440491 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="a34f4288-15d9-4ffc-8ff3-1a9f05001339" containerName="glance-log" containerID="cri-o://fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed" gracePeriod=30 Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.441143 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"a34f4288-15d9-4ffc-8ff3-1a9f05001339","Type":"ContainerStarted","Data":"d7a32e76c442025d6158a867badffa6c1cb5153910fec12674050878bf288027"} Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.440739 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="a34f4288-15d9-4ffc-8ff3-1a9f05001339" containerName="glance-httpd" containerID="cri-o://14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107" gracePeriod=30 Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.445357 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" 
event={"ID":"7431fc02-96c1-4a55-aad6-83c23610f7a0","Type":"ContainerStarted","Data":"15dd62250ce33307a42451ba46c49b5fd8fb925435c7bac5008c462104124a1e"} Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.445412 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"7431fc02-96c1-4a55-aad6-83c23610f7a0","Type":"ContainerStarted","Data":"199a67cd230a4ed0590560713085f31003749707b102262e412decacd3bc5f8a"} Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.447406 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"eea324bb-0c4d-4636-a821-07077ec1c6cf","Type":"ContainerStarted","Data":"34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1"} Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.447477 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"eea324bb-0c4d-4636-a821-07077ec1c6cf","Type":"ContainerStarted","Data":"4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d"} Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.447497 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"eea324bb-0c4d-4636-a821-07077ec1c6cf","Type":"ContainerStarted","Data":"f526f1ea024599ad5c40a21b77bcb68d705bb7e332d6f79c4ef6ec8f4728a0b9"} Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.451357 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5","Type":"ContainerStarted","Data":"5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7"} Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.451430 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5","Type":"ContainerStarted","Data":"ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d"} Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.451459 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5","Type":"ContainerStarted","Data":"0ea09f2f4047efa72a4e4b231231f61e4cf74e46184c573fe1ac5e45379297d1"} Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.466888 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.466863577 podStartE2EDuration="3.466863577s" podCreationTimestamp="2026-03-07 04:43:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:43:07.462331145 +0000 UTC m=+1432.508714634" watchObservedRunningTime="2026-03-07 04:43:07.466863577 +0000 UTC m=+1432.513247066" Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.506134 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=2.506119477 podStartE2EDuration="2.506119477s" podCreationTimestamp="2026-03-07 04:43:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:43:07.500234358 +0000 UTC m=+1432.546617847" watchObservedRunningTime="2026-03-07 04:43:07.506119477 +0000 UTC m=+1432.552502966" Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.549361 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.549343004 podStartE2EDuration="3.549343004s" podCreationTimestamp="2026-03-07 04:43:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:43:07.53958003 +0000 UTC m=+1432.585963519" watchObservedRunningTime="2026-03-07 04:43:07.549343004 +0000 UTC m=+1432.595726493" Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.577033 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.577010661 podStartE2EDuration="3.577010661s" podCreationTimestamp="2026-03-07 04:43:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:43:07.565568982 +0000 UTC m=+1432.611952461" watchObservedRunningTime="2026-03-07 04:43:07.577010661 +0000 UTC m=+1432.623394160" Mar 07 04:43:07 crc kubenswrapper[4689]: I0307 04:43:07.959873 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.064990 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-httpd-run\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065068 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065111 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-dev\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 
04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065137 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-lib-modules\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065158 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-scripts\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065190 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065211 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-iscsi\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065250 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-sys\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065268 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-run\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: 
\"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065284 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz27s\" (UniqueName: \"kubernetes.io/projected/a34f4288-15d9-4ffc-8ff3-1a9f05001339-kube-api-access-lz27s\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065299 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-var-locks-brick\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065317 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-config-data\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065348 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-nvme\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065372 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-logs\") pod \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\" (UID: \"a34f4288-15d9-4ffc-8ff3-1a9f05001339\") " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065395 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065393 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065442 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-dev" (OuterVolumeSpecName: "dev") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065462 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065607 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065619 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065627 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065634 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065875 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065899 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-sys" (OuterVolumeSpecName: "sys") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065962 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-run" (OuterVolumeSpecName: "run") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.065985 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.071756 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a34f4288-15d9-4ffc-8ff3-1a9f05001339-kube-api-access-lz27s" (OuterVolumeSpecName: "kube-api-access-lz27s") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "kube-api-access-lz27s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.072753 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-logs" (OuterVolumeSpecName: "logs") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.074144 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.079292 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.087281 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-scripts" (OuterVolumeSpecName: "scripts") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.122409 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-config-data" (OuterVolumeSpecName: "config-data") pod "a34f4288-15d9-4ffc-8ff3-1a9f05001339" (UID: "a34f4288-15d9-4ffc-8ff3-1a9f05001339"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.167882 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.167912 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.167921 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a34f4288-15d9-4ffc-8ff3-1a9f05001339-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.167946 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.167954 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a34f4288-15d9-4ffc-8ff3-1a9f05001339-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.167967 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.167977 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.167991 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.167999 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz27s\" (UniqueName: \"kubernetes.io/projected/a34f4288-15d9-4ffc-8ff3-1a9f05001339-kube-api-access-lz27s\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.168010 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a34f4288-15d9-4ffc-8ff3-1a9f05001339-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.189521 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.200441 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.269740 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.269991 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.466199 4689 generic.go:334] "Generic (PLEG): container finished" podID="a34f4288-15d9-4ffc-8ff3-1a9f05001339" containerID="14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107" exitCode=143 Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.466264 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="a34f4288-15d9-4ffc-8ff3-1a9f05001339" containerID="fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed" exitCode=143 Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.466321 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.466407 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"a34f4288-15d9-4ffc-8ff3-1a9f05001339","Type":"ContainerDied","Data":"14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107"} Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.466447 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"a34f4288-15d9-4ffc-8ff3-1a9f05001339","Type":"ContainerDied","Data":"fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed"} Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.466468 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"a34f4288-15d9-4ffc-8ff3-1a9f05001339","Type":"ContainerDied","Data":"d7a32e76c442025d6158a867badffa6c1cb5153910fec12674050878bf288027"} Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.466486 4689 scope.go:117] "RemoveContainer" containerID="14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.497662 4689 scope.go:117] "RemoveContainer" containerID="fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.505425 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.517329 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.537438 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:43:08 crc kubenswrapper[4689]: E0307 04:43:08.538001 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34f4288-15d9-4ffc-8ff3-1a9f05001339" containerName="glance-log" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.538107 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34f4288-15d9-4ffc-8ff3-1a9f05001339" containerName="glance-log" Mar 07 04:43:08 crc kubenswrapper[4689]: E0307 04:43:08.538262 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34f4288-15d9-4ffc-8ff3-1a9f05001339" containerName="glance-httpd" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.538375 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34f4288-15d9-4ffc-8ff3-1a9f05001339" containerName="glance-httpd" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.538613 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="a34f4288-15d9-4ffc-8ff3-1a9f05001339" containerName="glance-log" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.538708 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="a34f4288-15d9-4ffc-8ff3-1a9f05001339" containerName="glance-httpd" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.539684 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.544586 4689 scope.go:117] "RemoveContainer" containerID="14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107" Mar 07 04:43:08 crc kubenswrapper[4689]: E0307 04:43:08.548261 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107\": container with ID starting with 14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107 not found: ID does not exist" containerID="14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.548301 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107"} err="failed to get container status \"14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107\": rpc error: code = NotFound desc = could not find container \"14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107\": container with ID starting with 14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107 not found: ID does not exist" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.548327 4689 scope.go:117] "RemoveContainer" containerID="fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.550242 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:43:08 crc kubenswrapper[4689]: E0307 04:43:08.550387 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed\": container with ID starting with 
fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed not found: ID does not exist" containerID="fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.550417 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed"} err="failed to get container status \"fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed\": rpc error: code = NotFound desc = could not find container \"fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed\": container with ID starting with fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed not found: ID does not exist" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.550437 4689 scope.go:117] "RemoveContainer" containerID="14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.550712 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107"} err="failed to get container status \"14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107\": rpc error: code = NotFound desc = could not find container \"14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107\": container with ID starting with 14fffd2c415ac0fad16fce60e6ce0ef8c49b0b0e95bf7dcc9dd9665b5ff53107 not found: ID does not exist" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.550745 4689 scope.go:117] "RemoveContainer" containerID="fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.551089 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed"} err="failed to get container status 
\"fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed\": rpc error: code = NotFound desc = could not find container \"fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed\": container with ID starting with fc4b35a18d1c345c2709326d088bb396d57ecc282f4108d2896304a057c051ed not found: ID does not exist" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.676735 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.676927 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.677002 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.677145 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-logs\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.677272 
4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-sys\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.677324 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.677401 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-dev\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.677436 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.677481 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-run\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 
04:43:08.677577 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-scripts\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.677635 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.677739 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-config-data\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.677798 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.677980 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lb59\" (UniqueName: \"kubernetes.io/projected/4a182ec7-71d1-41d1-adf9-6542525e21ae-kube-api-access-6lb59\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.779919 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-logs\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780007 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-sys\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780051 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780092 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-dev\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780120 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc 
kubenswrapper[4689]: I0307 04:43:08.780156 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-run\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780228 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-scripts\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780282 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780312 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-config-data\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780346 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780387 4689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lb59\" (UniqueName: \"kubernetes.io/projected/4a182ec7-71d1-41d1-adf9-6542525e21ae-kube-api-access-6lb59\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780418 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780466 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780488 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780637 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780768 4689 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-logs\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.780962 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.781982 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-dev\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.781980 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.782058 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.782143 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.782085 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-run\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.782296 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.782418 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.782767 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-sys\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.786240 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-scripts\") pod 
\"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.789381 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-config-data\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.823933 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lb59\" (UniqueName: \"kubernetes.io/projected/4a182ec7-71d1-41d1-adf9-6542525e21ae-kube-api-access-6lb59\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.827894 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.830419 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-1\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:08 crc kubenswrapper[4689]: I0307 04:43:08.858977 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:09 crc kubenswrapper[4689]: I0307 04:43:09.317122 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:43:09 crc kubenswrapper[4689]: I0307 04:43:09.478199 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"4a182ec7-71d1-41d1-adf9-6542525e21ae","Type":"ContainerStarted","Data":"f313aa05451a85b94ac42c66b9b1b9d04ca2deff836ad33e0b58853a71c5bad3"} Mar 07 04:43:09 crc kubenswrapper[4689]: I0307 04:43:09.478574 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"4a182ec7-71d1-41d1-adf9-6542525e21ae","Type":"ContainerStarted","Data":"2df411a5c761876db220ff4780ec9bc5c2defa5df86fad791be38e8c3d167c14"} Mar 07 04:43:09 crc kubenswrapper[4689]: I0307 04:43:09.841569 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a34f4288-15d9-4ffc-8ff3-1a9f05001339" path="/var/lib/kubelet/pods/a34f4288-15d9-4ffc-8ff3-1a9f05001339/volumes" Mar 07 04:43:10 crc kubenswrapper[4689]: I0307 04:43:10.499919 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"4a182ec7-71d1-41d1-adf9-6542525e21ae","Type":"ContainerStarted","Data":"6baca0a2e4217ab822d429ff118c88b5d02c5f346faed3cb206829e039d71856"} Mar 07 04:43:10 crc kubenswrapper[4689]: I0307 04:43:10.539819 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=2.539791197 podStartE2EDuration="2.539791197s" podCreationTimestamp="2026-03-07 04:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:43:10.532837399 +0000 UTC m=+1435.579220928" 
watchObservedRunningTime="2026-03-07 04:43:10.539791197 +0000 UTC m=+1435.586174726" Mar 07 04:43:15 crc kubenswrapper[4689]: I0307 04:43:15.842575 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:15 crc kubenswrapper[4689]: I0307 04:43:15.843262 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:15 crc kubenswrapper[4689]: I0307 04:43:15.888079 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:15 crc kubenswrapper[4689]: I0307 04:43:15.889302 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.047263 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.047334 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.100730 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.123027 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.542041 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.542230 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.562779 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.562845 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.562866 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.563000 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.607067 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.607874 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:16 crc kubenswrapper[4689]: I0307 04:43:16.627622 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:17 crc kubenswrapper[4689]: I0307 04:43:17.567955 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.287343 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.348467 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 
04:43:18.566505 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.574241 4689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.575294 4689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.648024 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.697214 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.772471 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.776682 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.859808 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.859861 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.886274 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:18 crc kubenswrapper[4689]: I0307 04:43:18.943152 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:19 crc kubenswrapper[4689]: I0307 
04:43:19.586968 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:19 crc kubenswrapper[4689]: I0307 04:43:19.587783 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:20 crc kubenswrapper[4689]: I0307 04:43:20.598047 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" containerName="glance-log" containerID="cri-o://ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d" gracePeriod=30 Mar 07 04:43:20 crc kubenswrapper[4689]: I0307 04:43:20.598958 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" containerName="glance-httpd" containerID="cri-o://5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7" gracePeriod=30 Mar 07 04:43:21 crc kubenswrapper[4689]: I0307 04:43:21.385594 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:21 crc kubenswrapper[4689]: I0307 04:43:21.471755 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:43:21 crc kubenswrapper[4689]: I0307 04:43:21.516085 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:43:21 crc kubenswrapper[4689]: I0307 04:43:21.606625 4689 generic.go:334] "Generic (PLEG): container finished" podID="8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" containerID="ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d" exitCode=143 Mar 07 04:43:21 crc kubenswrapper[4689]: I0307 04:43:21.606931 4689 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="eea324bb-0c4d-4636-a821-07077ec1c6cf" containerName="glance-log" containerID="cri-o://4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d" gracePeriod=30 Mar 07 04:43:21 crc kubenswrapper[4689]: I0307 04:43:21.607037 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5","Type":"ContainerDied","Data":"ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d"} Mar 07 04:43:21 crc kubenswrapper[4689]: I0307 04:43:21.607851 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="eea324bb-0c4d-4636-a821-07077ec1c6cf" containerName="glance-httpd" containerID="cri-o://34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1" gracePeriod=30 Mar 07 04:43:22 crc kubenswrapper[4689]: I0307 04:43:22.621348 4689 generic.go:334] "Generic (PLEG): container finished" podID="eea324bb-0c4d-4636-a821-07077ec1c6cf" containerID="4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d" exitCode=143 Mar 07 04:43:22 crc kubenswrapper[4689]: I0307 04:43:22.621415 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"eea324bb-0c4d-4636-a821-07077ec1c6cf","Type":"ContainerDied","Data":"4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d"} Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.158627 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280357 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-logs\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280465 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-httpd-run\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280490 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-var-locks-brick\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280515 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-run\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280547 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-sys\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280574 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfp4v\" (UniqueName: 
\"kubernetes.io/projected/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-kube-api-access-jfp4v\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280625 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280659 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-nvme\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280658 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280680 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-run" (OuterVolumeSpecName: "run") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280676 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-sys" (OuterVolumeSpecName: "sys") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280775 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-logs" (OuterVolumeSpecName: "logs") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280688 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-config-data\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280816 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280868 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-dev\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280897 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-lib-modules\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280923 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-iscsi\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.280952 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-scripts\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281062 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\" (UID: \"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5\") " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281518 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-lib-modules" (OuterVolumeSpecName: "lib-modules") pod 
"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281538 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-dev" (OuterVolumeSpecName: "dev") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281581 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281846 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281868 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281895 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281912 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281927 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281941 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281956 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.281974 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.282001 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.286751 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-kube-api-access-jfp4v" (OuterVolumeSpecName: "kube-api-access-jfp4v") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "kube-api-access-jfp4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.288959 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance-cache") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.290933 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-scripts" (OuterVolumeSpecName: "scripts") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.291038 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "local-storage19-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.346459 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-config-data" (OuterVolumeSpecName: "config-data") pod "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" (UID: "8b31ffdb-e8a7-4b64-ac20-a8811a07abd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.383893 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.383976 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.383990 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.384007 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfp4v\" (UniqueName: \"kubernetes.io/projected/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-kube-api-access-jfp4v\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.384031 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.384043 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.400952 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.401020 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.485360 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.485583 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.646014 4689 generic.go:334] "Generic (PLEG): container finished" podID="8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" containerID="5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7" exitCode=0 Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.646079 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.646090 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5","Type":"ContainerDied","Data":"5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7"} Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.646209 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"8b31ffdb-e8a7-4b64-ac20-a8811a07abd5","Type":"ContainerDied","Data":"0ea09f2f4047efa72a4e4b231231f61e4cf74e46184c573fe1ac5e45379297d1"} Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.646256 4689 scope.go:117] "RemoveContainer" containerID="5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.686464 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.688705 4689 scope.go:117] "RemoveContainer" containerID="ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.698092 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.723353 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:43:24 crc kubenswrapper[4689]: E0307 04:43:24.723774 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" containerName="glance-httpd" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.723794 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" containerName="glance-httpd" Mar 07 
04:43:24 crc kubenswrapper[4689]: E0307 04:43:24.723813 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" containerName="glance-log" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.723824 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" containerName="glance-log" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.724056 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" containerName="glance-httpd" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.724089 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" containerName="glance-log" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.731324 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.731561 4689 scope.go:117] "RemoveContainer" containerID="5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7" Mar 07 04:43:24 crc kubenswrapper[4689]: E0307 04:43:24.736316 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7\": container with ID starting with 5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7 not found: ID does not exist" containerID="5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.736380 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7"} err="failed to get container status \"5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7\": rpc error: code = NotFound desc = could not 
find container \"5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7\": container with ID starting with 5235e8ac2685e3ffce98a5e0c7abcde46eee13d9d3c7fbd5859d10f06fc395f7 not found: ID does not exist" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.736410 4689 scope.go:117] "RemoveContainer" containerID="ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d" Mar 07 04:43:24 crc kubenswrapper[4689]: E0307 04:43:24.743800 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d\": container with ID starting with ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d not found: ID does not exist" containerID="ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.743855 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d"} err="failed to get container status \"ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d\": rpc error: code = NotFound desc = could not find container \"ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d\": container with ID starting with ca1a3a951ab62509330df4b9d3d25084a00cc2b0504086ae3e2338fb3c98f78d not found: ID does not exist" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.749828 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.892235 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpw8f\" (UniqueName: \"kubernetes.io/projected/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-kube-api-access-vpw8f\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") 
" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.892285 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-config-data\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.892318 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.892342 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.892529 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-sys\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.892694 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: 
\"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.893038 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.893218 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-dev\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.893316 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-logs\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.893402 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-scripts\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.893451 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-lib-modules\") pod \"glance-default-external-api-0\" 
(UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.893479 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-run\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.893525 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.893654 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995241 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpw8f\" (UniqueName: \"kubernetes.io/projected/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-kube-api-access-vpw8f\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995283 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995307 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995323 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995349 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-sys\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995374 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995402 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-0\" (UID: 
\"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995418 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-dev\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995436 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-logs\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995461 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-scripts\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995478 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995493 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-run\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995511 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995551 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995613 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995786 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995789 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-dev\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc 
kubenswrapper[4689]: I0307 04:43:24.995847 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-run\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995889 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-sys\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995854 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.995933 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.996025 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.996041 4689 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.996063 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-logs\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:24 crc kubenswrapper[4689]: I0307 04:43:24.996266 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:24.999995 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-scripts\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.005181 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-config-data\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.014277 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vpw8f\" (UniqueName: \"kubernetes.io/projected/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-kube-api-access-vpw8f\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.022237 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.044110 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") " pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.070199 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.153139 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300271 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-run\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300325 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-logs\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300355 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-nvme\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300354 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-run" (OuterVolumeSpecName: "run") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300422 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300447 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-scripts\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300468 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-sys\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300491 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-config-data\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300523 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300554 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-var-locks-brick\") pod 
\"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300570 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-iscsi\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300592 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-lib-modules\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300612 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-dev\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300624 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-httpd-run\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300622 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300652 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44fmm\" (UniqueName: \"kubernetes.io/projected/eea324bb-0c4d-4636-a821-07077ec1c6cf-kube-api-access-44fmm\") pod \"eea324bb-0c4d-4636-a821-07077ec1c6cf\" (UID: \"eea324bb-0c4d-4636-a821-07077ec1c6cf\") " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300746 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-sys" (OuterVolumeSpecName: "sys") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300822 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300868 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300876 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-dev" (OuterVolumeSpecName: "dev") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.300886 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.301242 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.301365 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-logs" (OuterVolumeSpecName: "logs") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.301517 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.301533 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.301542 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.301551 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.301559 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.301568 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.301576 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.301584 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/eea324bb-0c4d-4636-a821-07077ec1c6cf-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.304292 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea324bb-0c4d-4636-a821-07077ec1c6cf-kube-api-access-44fmm" (OuterVolumeSpecName: "kube-api-access-44fmm") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "kube-api-access-44fmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.304695 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.305035 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-scripts" (OuterVolumeSpecName: "scripts") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.305541 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.346368 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-config-data" (OuterVolumeSpecName: "config-data") pod "eea324bb-0c4d-4636-a821-07077ec1c6cf" (UID: "eea324bb-0c4d-4636-a821-07077ec1c6cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.403059 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.403095 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.403117 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea324bb-0c4d-4636-a821-07077ec1c6cf-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.403132 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.403144 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44fmm\" (UniqueName: \"kubernetes.io/projected/eea324bb-0c4d-4636-a821-07077ec1c6cf-kube-api-access-44fmm\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.403153 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea324bb-0c4d-4636-a821-07077ec1c6cf-logs\") 
on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.426521 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.436906 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 07 04:43:25 crc kubenswrapper[4689]: W0307 04:43:25.500261 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf1432a0_b2f0_4c75_9fa6_9e3d5b1d16c5.slice/crio-a1532bb81bff1fcad71188b2e8b776666b91a586715db81b97783ee7814f0b20 WatchSource:0}: Error finding container a1532bb81bff1fcad71188b2e8b776666b91a586715db81b97783ee7814f0b20: Status 404 returned error can't find the container with id a1532bb81bff1fcad71188b2e8b776666b91a586715db81b97783ee7814f0b20 Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.503848 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.507891 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.508031 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.668389 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5","Type":"ContainerStarted","Data":"21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e"} Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.668471 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5","Type":"ContainerStarted","Data":"a1532bb81bff1fcad71188b2e8b776666b91a586715db81b97783ee7814f0b20"} Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.670631 4689 generic.go:334] "Generic (PLEG): container finished" podID="eea324bb-0c4d-4636-a821-07077ec1c6cf" containerID="34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1" exitCode=0 Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.670664 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.670705 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"eea324bb-0c4d-4636-a821-07077ec1c6cf","Type":"ContainerDied","Data":"34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1"} Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.670768 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"eea324bb-0c4d-4636-a821-07077ec1c6cf","Type":"ContainerDied","Data":"f526f1ea024599ad5c40a21b77bcb68d705bb7e332d6f79c4ef6ec8f4728a0b9"} Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.670822 4689 scope.go:117] "RemoveContainer" containerID="34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.709255 4689 scope.go:117] "RemoveContainer" containerID="4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.719414 
4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.727828 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.760343 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:43:25 crc kubenswrapper[4689]: E0307 04:43:25.760701 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea324bb-0c4d-4636-a821-07077ec1c6cf" containerName="glance-log" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.760719 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea324bb-0c4d-4636-a821-07077ec1c6cf" containerName="glance-log" Mar 07 04:43:25 crc kubenswrapper[4689]: E0307 04:43:25.760746 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea324bb-0c4d-4636-a821-07077ec1c6cf" containerName="glance-httpd" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.760755 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea324bb-0c4d-4636-a821-07077ec1c6cf" containerName="glance-httpd" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.760904 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea324bb-0c4d-4636-a821-07077ec1c6cf" containerName="glance-httpd" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.760930 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea324bb-0c4d-4636-a821-07077ec1c6cf" containerName="glance-log" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.761819 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.772605 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.775619 4689 scope.go:117] "RemoveContainer" containerID="34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1" Mar 07 04:43:25 crc kubenswrapper[4689]: E0307 04:43:25.776052 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1\": container with ID starting with 34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1 not found: ID does not exist" containerID="34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.776091 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1"} err="failed to get container status \"34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1\": rpc error: code = NotFound desc = could not find container \"34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1\": container with ID starting with 34260964d819acf86b7770ff3e49e62ce078d155b1d41723b96ac39163eb16f1 not found: ID does not exist" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.776115 4689 scope.go:117] "RemoveContainer" containerID="4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d" Mar 07 04:43:25 crc kubenswrapper[4689]: E0307 04:43:25.776601 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d\": container with ID starting with 
4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d not found: ID does not exist" containerID="4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.776627 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d"} err="failed to get container status \"4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d\": rpc error: code = NotFound desc = could not find container \"4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d\": container with ID starting with 4490b0101b697827096769a733c3f37750aea98be5d1081769dfc375b0407d7d not found: ID does not exist" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.835839 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b31ffdb-e8a7-4b64-ac20-a8811a07abd5" path="/var/lib/kubelet/pods/8b31ffdb-e8a7-4b64-ac20-a8811a07abd5/volumes" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.836529 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea324bb-0c4d-4636-a821-07077ec1c6cf" path="/var/lib/kubelet/pods/eea324bb-0c4d-4636-a821-07077ec1c6cf/volumes" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914111 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-dev\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914217 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: 
\"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914272 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql6jt\" (UniqueName: \"kubernetes.io/projected/c34a0280-1ca8-411a-9a20-77028dfad0fd-kube-api-access-ql6jt\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914376 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914536 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-logs\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914589 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914625 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914665 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914695 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914757 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914802 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914832 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-sys\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914908 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:25 crc kubenswrapper[4689]: I0307 04:43:25.914946 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.016858 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017072 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-run\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017103 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017156 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017218 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017321 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017273 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017246 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017326 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017347 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017424 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-sys\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017512 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017525 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017567 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017510 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-sys\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017692 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-dev\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017716 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017746 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql6jt\" (UniqueName: \"kubernetes.io/projected/c34a0280-1ca8-411a-9a20-77028dfad0fd-kube-api-access-ql6jt\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017767 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017835 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017850 4689 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017775 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-dev\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.017775 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.018133 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-logs\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.018672 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-logs\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.023087 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.028309 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.039483 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql6jt\" (UniqueName: \"kubernetes.io/projected/c34a0280-1ca8-411a-9a20-77028dfad0fd-kube-api-access-ql6jt\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc 
kubenswrapper[4689]: I0307 04:43:26.040919 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.046833 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.093306 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.610302 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.734146 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5","Type":"ContainerStarted","Data":"2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff"} Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.735587 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c34a0280-1ca8-411a-9a20-77028dfad0fd","Type":"ContainerStarted","Data":"d0f94fa5b49bb7271c458f81dcaaf866f8a18964513fee6ccc9707792cb9d715"} Mar 07 04:43:26 crc kubenswrapper[4689]: I0307 04:43:26.768600 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.76858112 podStartE2EDuration="2.76858112s" 
podCreationTimestamp="2026-03-07 04:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:43:26.756987577 +0000 UTC m=+1451.803371076" watchObservedRunningTime="2026-03-07 04:43:26.76858112 +0000 UTC m=+1451.814964609" Mar 07 04:43:27 crc kubenswrapper[4689]: I0307 04:43:27.745763 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c34a0280-1ca8-411a-9a20-77028dfad0fd","Type":"ContainerStarted","Data":"0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9"} Mar 07 04:43:27 crc kubenswrapper[4689]: I0307 04:43:27.746318 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c34a0280-1ca8-411a-9a20-77028dfad0fd","Type":"ContainerStarted","Data":"b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e"} Mar 07 04:43:27 crc kubenswrapper[4689]: I0307 04:43:27.767200 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.767157752 podStartE2EDuration="2.767157752s" podCreationTimestamp="2026-03-07 04:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:43:27.762137346 +0000 UTC m=+1452.808520845" watchObservedRunningTime="2026-03-07 04:43:27.767157752 +0000 UTC m=+1452.813541241" Mar 07 04:43:35 crc kubenswrapper[4689]: I0307 04:43:35.071898 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:35 crc kubenswrapper[4689]: I0307 04:43:35.072526 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:35 crc kubenswrapper[4689]: I0307 
04:43:35.104699 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:35 crc kubenswrapper[4689]: I0307 04:43:35.128063 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:35 crc kubenswrapper[4689]: I0307 04:43:35.819127 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:35 crc kubenswrapper[4689]: I0307 04:43:35.819177 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:36 crc kubenswrapper[4689]: I0307 04:43:36.094156 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:36 crc kubenswrapper[4689]: I0307 04:43:36.095137 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:36 crc kubenswrapper[4689]: I0307 04:43:36.121486 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:36 crc kubenswrapper[4689]: I0307 04:43:36.153336 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:36 crc kubenswrapper[4689]: I0307 04:43:36.847151 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:36 crc kubenswrapper[4689]: I0307 04:43:36.847203 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:37 crc kubenswrapper[4689]: I0307 04:43:37.626658 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:37 crc kubenswrapper[4689]: I0307 04:43:37.753803 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Mar 07 04:43:38 crc kubenswrapper[4689]: I0307 04:43:38.663696 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:38 crc kubenswrapper[4689]: I0307 04:43:38.697608 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:43:51 crc kubenswrapper[4689]: I0307 04:43:51.646699 4689 scope.go:117] "RemoveContainer" containerID="f698e75ad978cfba6d54756f60d78533f9e190c43a21361e46c8f523e46705bd" Mar 07 04:43:51 crc kubenswrapper[4689]: I0307 04:43:51.676857 4689 scope.go:117] "RemoveContainer" containerID="f4786452fdcaa7ad2115ba9bb4423fe6f9cdce0e73e522aa6b1b0cd9da170b25" Mar 07 04:44:00 crc kubenswrapper[4689]: I0307 04:44:00.149894 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547644-s9mfn"] Mar 07 04:44:00 crc kubenswrapper[4689]: I0307 04:44:00.154211 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547644-s9mfn" Mar 07 04:44:00 crc kubenswrapper[4689]: I0307 04:44:00.158690 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:44:00 crc kubenswrapper[4689]: I0307 04:44:00.159356 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:44:00 crc kubenswrapper[4689]: I0307 04:44:00.159535 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:44:00 crc kubenswrapper[4689]: I0307 04:44:00.165441 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547644-s9mfn"] Mar 07 04:44:00 crc kubenswrapper[4689]: I0307 04:44:00.239628 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8z7t\" (UniqueName: \"kubernetes.io/projected/ba3603b5-9735-4260-8976-8589b3013d8d-kube-api-access-q8z7t\") pod \"auto-csr-approver-29547644-s9mfn\" (UID: \"ba3603b5-9735-4260-8976-8589b3013d8d\") " pod="openshift-infra/auto-csr-approver-29547644-s9mfn" Mar 07 04:44:00 crc kubenswrapper[4689]: I0307 04:44:00.341493 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8z7t\" (UniqueName: \"kubernetes.io/projected/ba3603b5-9735-4260-8976-8589b3013d8d-kube-api-access-q8z7t\") pod \"auto-csr-approver-29547644-s9mfn\" (UID: \"ba3603b5-9735-4260-8976-8589b3013d8d\") " pod="openshift-infra/auto-csr-approver-29547644-s9mfn" Mar 07 04:44:00 crc kubenswrapper[4689]: I0307 04:44:00.370923 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8z7t\" (UniqueName: \"kubernetes.io/projected/ba3603b5-9735-4260-8976-8589b3013d8d-kube-api-access-q8z7t\") pod \"auto-csr-approver-29547644-s9mfn\" (UID: \"ba3603b5-9735-4260-8976-8589b3013d8d\") " 
pod="openshift-infra/auto-csr-approver-29547644-s9mfn" Mar 07 04:44:00 crc kubenswrapper[4689]: I0307 04:44:00.483837 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547644-s9mfn" Mar 07 04:44:00 crc kubenswrapper[4689]: I0307 04:44:00.953676 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547644-s9mfn"] Mar 07 04:44:01 crc kubenswrapper[4689]: I0307 04:44:01.061533 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547644-s9mfn" event={"ID":"ba3603b5-9735-4260-8976-8589b3013d8d","Type":"ContainerStarted","Data":"863e3a537c248c11c69f1cbca42a38ef27398fb368ac97c2e6a1afdcaeb8e92d"} Mar 07 04:44:03 crc kubenswrapper[4689]: I0307 04:44:03.078932 4689 generic.go:334] "Generic (PLEG): container finished" podID="ba3603b5-9735-4260-8976-8589b3013d8d" containerID="301c5d60131d107d73acd8f94df46d4c74660137febf8822678d498d3d23e1af" exitCode=0 Mar 07 04:44:03 crc kubenswrapper[4689]: I0307 04:44:03.079032 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547644-s9mfn" event={"ID":"ba3603b5-9735-4260-8976-8589b3013d8d","Type":"ContainerDied","Data":"301c5d60131d107d73acd8f94df46d4c74660137febf8822678d498d3d23e1af"} Mar 07 04:44:04 crc kubenswrapper[4689]: I0307 04:44:04.390672 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547644-s9mfn" Mar 07 04:44:04 crc kubenswrapper[4689]: I0307 04:44:04.501394 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8z7t\" (UniqueName: \"kubernetes.io/projected/ba3603b5-9735-4260-8976-8589b3013d8d-kube-api-access-q8z7t\") pod \"ba3603b5-9735-4260-8976-8589b3013d8d\" (UID: \"ba3603b5-9735-4260-8976-8589b3013d8d\") " Mar 07 04:44:04 crc kubenswrapper[4689]: I0307 04:44:04.514383 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3603b5-9735-4260-8976-8589b3013d8d-kube-api-access-q8z7t" (OuterVolumeSpecName: "kube-api-access-q8z7t") pod "ba3603b5-9735-4260-8976-8589b3013d8d" (UID: "ba3603b5-9735-4260-8976-8589b3013d8d"). InnerVolumeSpecName "kube-api-access-q8z7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:04 crc kubenswrapper[4689]: I0307 04:44:04.603448 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8z7t\" (UniqueName: \"kubernetes.io/projected/ba3603b5-9735-4260-8976-8589b3013d8d-kube-api-access-q8z7t\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:05 crc kubenswrapper[4689]: I0307 04:44:05.104477 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547644-s9mfn" event={"ID":"ba3603b5-9735-4260-8976-8589b3013d8d","Type":"ContainerDied","Data":"863e3a537c248c11c69f1cbca42a38ef27398fb368ac97c2e6a1afdcaeb8e92d"} Mar 07 04:44:05 crc kubenswrapper[4689]: I0307 04:44:05.104551 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547644-s9mfn" Mar 07 04:44:05 crc kubenswrapper[4689]: I0307 04:44:05.104578 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="863e3a537c248c11c69f1cbca42a38ef27398fb368ac97c2e6a1afdcaeb8e92d" Mar 07 04:44:05 crc kubenswrapper[4689]: I0307 04:44:05.459771 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547638-ztptc"] Mar 07 04:44:05 crc kubenswrapper[4689]: I0307 04:44:05.465844 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547638-ztptc"] Mar 07 04:44:05 crc kubenswrapper[4689]: I0307 04:44:05.836562 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5dab47b-a90b-4452-9afa-73e38014b1a5" path="/var/lib/kubelet/pods/d5dab47b-a90b-4452-9afa-73e38014b1a5/volumes" Mar 07 04:44:21 crc kubenswrapper[4689]: I0307 04:44:21.911707 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Mar 07 04:44:21 crc kubenswrapper[4689]: I0307 04:44:21.912829 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="7431fc02-96c1-4a55-aad6-83c23610f7a0" containerName="glance-log" containerID="cri-o://199a67cd230a4ed0590560713085f31003749707b102262e412decacd3bc5f8a" gracePeriod=30 Mar 07 04:44:21 crc kubenswrapper[4689]: I0307 04:44:21.912965 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="7431fc02-96c1-4a55-aad6-83c23610f7a0" containerName="glance-httpd" containerID="cri-o://15dd62250ce33307a42451ba46c49b5fd8fb925435c7bac5008c462104124a1e" gracePeriod=30 Mar 07 04:44:22 crc kubenswrapper[4689]: I0307 04:44:22.063108 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Mar 07 04:44:22 
crc kubenswrapper[4689]: I0307 04:44:22.063889 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="4a182ec7-71d1-41d1-adf9-6542525e21ae" containerName="glance-log" containerID="cri-o://f313aa05451a85b94ac42c66b9b1b9d04ca2deff836ad33e0b58853a71c5bad3" gracePeriod=30 Mar 07 04:44:22 crc kubenswrapper[4689]: I0307 04:44:22.063986 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="4a182ec7-71d1-41d1-adf9-6542525e21ae" containerName="glance-httpd" containerID="cri-o://6baca0a2e4217ab822d429ff118c88b5d02c5f346faed3cb206829e039d71856" gracePeriod=30 Mar 07 04:44:22 crc kubenswrapper[4689]: I0307 04:44:22.279188 4689 generic.go:334] "Generic (PLEG): container finished" podID="4a182ec7-71d1-41d1-adf9-6542525e21ae" containerID="f313aa05451a85b94ac42c66b9b1b9d04ca2deff836ad33e0b58853a71c5bad3" exitCode=143 Mar 07 04:44:22 crc kubenswrapper[4689]: I0307 04:44:22.279286 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"4a182ec7-71d1-41d1-adf9-6542525e21ae","Type":"ContainerDied","Data":"f313aa05451a85b94ac42c66b9b1b9d04ca2deff836ad33e0b58853a71c5bad3"} Mar 07 04:44:22 crc kubenswrapper[4689]: I0307 04:44:22.281155 4689 generic.go:334] "Generic (PLEG): container finished" podID="7431fc02-96c1-4a55-aad6-83c23610f7a0" containerID="199a67cd230a4ed0590560713085f31003749707b102262e412decacd3bc5f8a" exitCode=143 Mar 07 04:44:22 crc kubenswrapper[4689]: I0307 04:44:22.281217 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"7431fc02-96c1-4a55-aad6-83c23610f7a0","Type":"ContainerDied","Data":"199a67cd230a4ed0590560713085f31003749707b102262e412decacd3bc5f8a"} Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.272298 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/glance-db-sync-sjcmd"] Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.277591 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-sjcmd"] Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.339541 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance1dad-account-delete-zpt88"] Mar 07 04:44:23 crc kubenswrapper[4689]: E0307 04:44:23.339873 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3603b5-9735-4260-8976-8589b3013d8d" containerName="oc" Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.339888 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3603b5-9735-4260-8976-8589b3013d8d" containerName="oc" Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.340040 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3603b5-9735-4260-8976-8589b3013d8d" containerName="oc" Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.340527 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.353645 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.353867 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerName="glance-log" containerID="cri-o://b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e" gracePeriod=30 Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.353969 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerName="glance-httpd" containerID="cri-o://0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9" gracePeriod=30 Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.366303 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance1dad-account-delete-zpt88"] Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.418188 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.418488 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerName="glance-log" containerID="cri-o://21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e" gracePeriod=30 Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.418788 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerName="glance-httpd" 
containerID="cri-o://2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff" gracePeriod=30 Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.525181 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd25307b-477c-49d2-a59c-b492f873be79-operator-scripts\") pod \"glance1dad-account-delete-zpt88\" (UID: \"cd25307b-477c-49d2-a59c-b492f873be79\") " pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.525363 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pkcb\" (UniqueName: \"kubernetes.io/projected/cd25307b-477c-49d2-a59c-b492f873be79-kube-api-access-2pkcb\") pod \"glance1dad-account-delete-zpt88\" (UID: \"cd25307b-477c-49d2-a59c-b492f873be79\") " pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.626667 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pkcb\" (UniqueName: \"kubernetes.io/projected/cd25307b-477c-49d2-a59c-b492f873be79-kube-api-access-2pkcb\") pod \"glance1dad-account-delete-zpt88\" (UID: \"cd25307b-477c-49d2-a59c-b492f873be79\") " pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.626793 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd25307b-477c-49d2-a59c-b492f873be79-operator-scripts\") pod \"glance1dad-account-delete-zpt88\" (UID: \"cd25307b-477c-49d2-a59c-b492f873be79\") " pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.627456 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cd25307b-477c-49d2-a59c-b492f873be79-operator-scripts\") pod \"glance1dad-account-delete-zpt88\" (UID: \"cd25307b-477c-49d2-a59c-b492f873be79\") " pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.651749 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pkcb\" (UniqueName: \"kubernetes.io/projected/cd25307b-477c-49d2-a59c-b492f873be79-kube-api-access-2pkcb\") pod \"glance1dad-account-delete-zpt88\" (UID: \"cd25307b-477c-49d2-a59c-b492f873be79\") " pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.658631 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" Mar 07 04:44:23 crc kubenswrapper[4689]: I0307 04:44:23.836428 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6740dbc-96ca-41c6-865d-c2c2e195c954" path="/var/lib/kubelet/pods/e6740dbc-96ca-41c6-865d-c2c2e195c954/volumes" Mar 07 04:44:24 crc kubenswrapper[4689]: I0307 04:44:24.130355 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance1dad-account-delete-zpt88"] Mar 07 04:44:24 crc kubenswrapper[4689]: I0307 04:44:24.299251 4689 generic.go:334] "Generic (PLEG): container finished" podID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerID="b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e" exitCode=143 Mar 07 04:44:24 crc kubenswrapper[4689]: I0307 04:44:24.299330 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c34a0280-1ca8-411a-9a20-77028dfad0fd","Type":"ContainerDied","Data":"b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e"} Mar 07 04:44:24 crc kubenswrapper[4689]: I0307 04:44:24.300797 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" event={"ID":"cd25307b-477c-49d2-a59c-b492f873be79","Type":"ContainerStarted","Data":"ddcf9a57f6ae602aa55f457de587e1eb15f5211939b9ceb36d8b2bb7207f1422"} Mar 07 04:44:24 crc kubenswrapper[4689]: I0307 04:44:24.300856 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" event={"ID":"cd25307b-477c-49d2-a59c-b492f873be79","Type":"ContainerStarted","Data":"419234ef6de02bd19df46738ba5cd433383268297794d644662d62fe47eb5860"} Mar 07 04:44:24 crc kubenswrapper[4689]: I0307 04:44:24.303993 4689 generic.go:334] "Generic (PLEG): container finished" podID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerID="21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e" exitCode=143 Mar 07 04:44:24 crc kubenswrapper[4689]: I0307 04:44:24.304067 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5","Type":"ContainerDied","Data":"21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e"} Mar 07 04:44:24 crc kubenswrapper[4689]: I0307 04:44:24.328093 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" podStartSLOduration=1.328061936 podStartE2EDuration="1.328061936s" podCreationTimestamp="2026-03-07 04:44:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:44:24.320062041 +0000 UTC m=+1509.366445570" watchObservedRunningTime="2026-03-07 04:44:24.328061936 +0000 UTC m=+1509.374445465" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.319607 4689 generic.go:334] "Generic (PLEG): container finished" podID="7431fc02-96c1-4a55-aad6-83c23610f7a0" containerID="15dd62250ce33307a42451ba46c49b5fd8fb925435c7bac5008c462104124a1e" exitCode=0 Mar 07 04:44:25 crc kubenswrapper[4689]: 
I0307 04:44:25.319831 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"7431fc02-96c1-4a55-aad6-83c23610f7a0","Type":"ContainerDied","Data":"15dd62250ce33307a42451ba46c49b5fd8fb925435c7bac5008c462104124a1e"} Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.328492 4689 generic.go:334] "Generic (PLEG): container finished" podID="cd25307b-477c-49d2-a59c-b492f873be79" containerID="ddcf9a57f6ae602aa55f457de587e1eb15f5211939b9ceb36d8b2bb7207f1422" exitCode=0 Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.328598 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" event={"ID":"cd25307b-477c-49d2-a59c-b492f873be79","Type":"ContainerDied","Data":"ddcf9a57f6ae602aa55f457de587e1eb15f5211939b9ceb36d8b2bb7207f1422"} Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.330514 4689 generic.go:334] "Generic (PLEG): container finished" podID="4a182ec7-71d1-41d1-adf9-6542525e21ae" containerID="6baca0a2e4217ab822d429ff118c88b5d02c5f346faed3cb206829e039d71856" exitCode=0 Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.330547 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"4a182ec7-71d1-41d1-adf9-6542525e21ae","Type":"ContainerDied","Data":"6baca0a2e4217ab822d429ff118c88b5d02c5f346faed3cb206829e039d71856"} Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.572271 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.643584 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.666636 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-sys\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.667039 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-nvme\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.667308 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-run\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.667756 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-httpd-run\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.667791 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-iscsi\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.666836 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-sys" 
(OuterVolumeSpecName: "sys") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668021 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668094 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668131 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-run" (OuterVolumeSpecName: "run") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668236 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-logs\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668289 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-scripts\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668318 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-dev\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668372 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt8dh\" (UniqueName: \"kubernetes.io/projected/7431fc02-96c1-4a55-aad6-83c23610f7a0-kube-api-access-mt8dh\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668410 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668431 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-lib-modules\") 
pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668483 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-config-data\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668545 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668575 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-var-locks-brick\") pod \"7431fc02-96c1-4a55-aad6-83c23610f7a0\" (UID: \"7431fc02-96c1-4a55-aad6-83c23610f7a0\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.669075 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.669090 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.668538 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.669126 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.669461 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-dev" (OuterVolumeSpecName: "dev") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.669529 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.669986 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-logs" (OuterVolumeSpecName: "logs") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.673492 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.674280 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.674461 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7431fc02-96c1-4a55-aad6-83c23610f7a0-kube-api-access-mt8dh" (OuterVolumeSpecName: "kube-api-access-mt8dh") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "kube-api-access-mt8dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.681903 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-scripts" (OuterVolumeSpecName: "scripts") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.724982 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-config-data" (OuterVolumeSpecName: "config-data") pod "7431fc02-96c1-4a55-aad6-83c23610f7a0" (UID: "7431fc02-96c1-4a55-aad6-83c23610f7a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.769771 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lb59\" (UniqueName: \"kubernetes.io/projected/4a182ec7-71d1-41d1-adf9-6542525e21ae-kube-api-access-6lb59\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.769909 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.769947 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-nvme\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.769979 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-logs\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770009 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-config-data\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770552 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-httpd-run\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770121 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770579 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-run\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770688 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-dev\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770392 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-logs" (OuterVolumeSpecName: "logs") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770638 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-run" (OuterVolumeSpecName: "run") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770725 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770756 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-dev" (OuterVolumeSpecName: "dev") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770760 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-var-locks-brick\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770784 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770826 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-lib-modules\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770862 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-iscsi\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770907 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770933 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770945 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-scripts\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.770993 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-sys\") pod \"4a182ec7-71d1-41d1-adf9-6542525e21ae\" (UID: \"4a182ec7-71d1-41d1-adf9-6542525e21ae\") " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771214 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771240 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-sys" (OuterVolumeSpecName: "sys") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771475 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-sys\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771534 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771552 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771567 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771581 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771594 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7431fc02-96c1-4a55-aad6-83c23610f7a0-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771608 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771621 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771634 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-logs\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771648 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771661 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a182ec7-71d1-41d1-adf9-6542525e21ae-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771676 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt8dh\" (UniqueName: \"kubernetes.io/projected/7431fc02-96c1-4a55-aad6-83c23610f7a0-kube-api-access-mt8dh\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771690 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-run\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771719 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771735 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-dev\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771750 4689 reconciler_common.go:293] "Volume detached 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7431fc02-96c1-4a55-aad6-83c23610f7a0-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771764 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771778 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7431fc02-96c1-4a55-aad6-83c23610f7a0-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771792 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771805 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a182ec7-71d1-41d1-adf9-6542525e21ae-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.771828 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.773790 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.774249 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-scripts" (OuterVolumeSpecName: "scripts") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.774252 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.775791 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a182ec7-71d1-41d1-adf9-6542525e21ae-kube-api-access-6lb59" (OuterVolumeSpecName: "kube-api-access-6lb59") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "kube-api-access-6lb59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.786114 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.790784 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.831918 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-config-data" (OuterVolumeSpecName: "config-data") pod "4a182ec7-71d1-41d1-adf9-6542525e21ae" (UID: "4a182ec7-71d1-41d1-adf9-6542525e21ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.873811 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.873841 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.873852 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.873864 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 
04:44:25.873875 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a182ec7-71d1-41d1-adf9-6542525e21ae-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.873883 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.873892 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lb59\" (UniqueName: \"kubernetes.io/projected/4a182ec7-71d1-41d1-adf9-6542525e21ae-kube-api-access-6lb59\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.886531 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.898299 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.975715 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:25 crc kubenswrapper[4689]: I0307 04:44:25.975771 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.342162 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"7431fc02-96c1-4a55-aad6-83c23610f7a0","Type":"ContainerDied","Data":"e34652d4c3bab94f17b2708c2978a717754804203f62e845d54ae15d85fac95b"} 
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.342269 4689 scope.go:117] "RemoveContainer" containerID="15dd62250ce33307a42451ba46c49b5fd8fb925435c7bac5008c462104124a1e"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.342307 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.348232 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.348807 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"4a182ec7-71d1-41d1-adf9-6542525e21ae","Type":"ContainerDied","Data":"2df411a5c761876db220ff4780ec9bc5c2defa5df86fad791be38e8c3d167c14"}
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.394053 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.399137 4689 scope.go:117] "RemoveContainer" containerID="199a67cd230a4ed0590560713085f31003749707b102262e412decacd3bc5f8a"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.405343 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.411946 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.418202 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.429701 4689 scope.go:117] "RemoveContainer" containerID="6baca0a2e4217ab822d429ff118c88b5d02c5f346faed3cb206829e039d71856"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.468943 4689 scope.go:117] "RemoveContainer" containerID="f313aa05451a85b94ac42c66b9b1b9d04ca2deff836ad33e0b58853a71c5bad3"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.548749 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.158:9292/healthcheck\": read tcp 10.217.0.2:35230->10.217.0.158:9292: read: connection reset by peer"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.548761 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.158:9292/healthcheck\": read tcp 10.217.0.2:35226->10.217.0.158:9292: read: connection reset by peer"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.586876 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.157:9292/healthcheck\": read tcp 10.217.0.2:49436->10.217.0.157:9292: read: connection reset by peer"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.586913 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.157:9292/healthcheck\": read tcp 10.217.0.2:49428->10.217.0.157:9292: read: connection reset by peer"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.757695 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance1dad-account-delete-zpt88"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.890422 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pkcb\" (UniqueName: \"kubernetes.io/projected/cd25307b-477c-49d2-a59c-b492f873be79-kube-api-access-2pkcb\") pod \"cd25307b-477c-49d2-a59c-b492f873be79\" (UID: \"cd25307b-477c-49d2-a59c-b492f873be79\") "
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.890840 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd25307b-477c-49d2-a59c-b492f873be79-operator-scripts\") pod \"cd25307b-477c-49d2-a59c-b492f873be79\" (UID: \"cd25307b-477c-49d2-a59c-b492f873be79\") "
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.892143 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd25307b-477c-49d2-a59c-b492f873be79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd25307b-477c-49d2-a59c-b492f873be79" (UID: "cd25307b-477c-49d2-a59c-b492f873be79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.895788 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd25307b-477c-49d2-a59c-b492f873be79-kube-api-access-2pkcb" (OuterVolumeSpecName: "kube-api-access-2pkcb") pod "cd25307b-477c-49d2-a59c-b492f873be79" (UID: "cd25307b-477c-49d2-a59c-b492f873be79"). InnerVolumeSpecName "kube-api-access-2pkcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.935491 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.940508 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.993484 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd25307b-477c-49d2-a59c-b492f873be79-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:26 crc kubenswrapper[4689]: I0307 04:44:26.993539 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pkcb\" (UniqueName: \"kubernetes.io/projected/cd25307b-477c-49d2-a59c-b492f873be79-kube-api-access-2pkcb\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.094775 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.095111 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.095260 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-sys\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.095484 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-config-data\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.095584 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-nvme\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.095687 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-httpd-run\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.095771 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.095847 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-run\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.095930 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-httpd-run\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096018 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-var-locks-brick\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096092 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-run\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096214 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-lib-modules\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096291 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-iscsi\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096385 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-config-data\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096457 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-scripts\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096539 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-var-locks-brick\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096621 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096694 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-dev\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096762 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpw8f\" (UniqueName: \"kubernetes.io/projected/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-kube-api-access-vpw8f\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096838 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-scripts\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096972 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql6jt\" (UniqueName: \"kubernetes.io/projected/c34a0280-1ca8-411a-9a20-77028dfad0fd-kube-api-access-ql6jt\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.097064 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-sys\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.097148 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-nvme\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.097276 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-logs\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.097348 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-logs\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.097412 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-iscsi\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.097559 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-lib-modules\") pod \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\" (UID: \"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.097694 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-dev\") pod \"c34a0280-1ca8-411a-9a20-77028dfad0fd\" (UID: \"c34a0280-1ca8-411a-9a20-77028dfad0fd\") "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.095590 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-sys" (OuterVolumeSpecName: "sys") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.099323 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096369 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-run" (OuterVolumeSpecName: "run") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096410 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096448 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096475 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-run" (OuterVolumeSpecName: "run") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096478 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096824 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096843 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096871 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-dev" (OuterVolumeSpecName: "dev") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.096877 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.098035 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.098222 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-dev" (OuterVolumeSpecName: "dev") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.099217 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-logs" (OuterVolumeSpecName: "logs") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.099238 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.099253 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.099250 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-logs" (OuterVolumeSpecName: "logs") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.099345 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.099802 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance-cache") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.100267 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-sys" (OuterVolumeSpecName: "sys") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.100497 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-kube-api-access-vpw8f" (OuterVolumeSpecName: "kube-api-access-vpw8f") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "kube-api-access-vpw8f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.100543 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.102291 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-scripts" (OuterVolumeSpecName: "scripts") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.102320 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c34a0280-1ca8-411a-9a20-77028dfad0fd-kube-api-access-ql6jt" (OuterVolumeSpecName: "kube-api-access-ql6jt") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "kube-api-access-ql6jt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.103460 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.103739 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-scripts" (OuterVolumeSpecName: "scripts") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.150571 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-config-data" (OuterVolumeSpecName: "config-data") pod "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" (UID: "df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.162510 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-config-data" (OuterVolumeSpecName: "config-data") pod "c34a0280-1ca8-411a-9a20-77028dfad0fd" (UID: "c34a0280-1ca8-411a-9a20-77028dfad0fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200106 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-var-locks-brick\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200258 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200285 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-dev\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200307 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpw8f\" (UniqueName: \"kubernetes.io/projected/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-kube-api-access-vpw8f\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200328 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200346 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql6jt\" (UniqueName: \"kubernetes.io/projected/c34a0280-1ca8-411a-9a20-77028dfad0fd-kube-api-access-ql6jt\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200364 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-sys\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200380 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-nvme\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200397 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-logs\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200414 4689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-logs\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200431 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-etc-iscsi\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200450 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-lib-modules\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200741 4689 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-dev\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200812 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200830 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.200959 4689 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-sys\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201002 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c34a0280-1ca8-411a-9a20-77028dfad0fd-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201020 4689 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-nvme\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201032 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201066 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201079 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-run\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201092 4689 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c34a0280-1ca8-411a-9a20-77028dfad0fd-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201103 4689 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-var-locks-brick\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201116 4689 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-run\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201127 4689 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c34a0280-1ca8-411a-9a20-77028dfad0fd-lib-modules\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201138 4689 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-etc-iscsi\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201150 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.201161 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.213441 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc"
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.213505 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.225423 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc"
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.233250 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.302107 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.302149 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.302163 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.302200 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.373970 4689 generic.go:334] "Generic (PLEG): container finished" podID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerID="2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff" exitCode=0
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.374018 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5","Type":"ContainerDied","Data":"2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff"}
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.376442 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5","Type":"ContainerDied","Data":"a1532bb81bff1fcad71188b2e8b776666b91a586715db81b97783ee7814f0b20"}
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.376623 4689 scope.go:117] "RemoveContainer" containerID="2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff"
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.374093 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.388902 4689 generic.go:334] "Generic (PLEG): container finished" podID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerID="0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9" exitCode=0
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.389007 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c34a0280-1ca8-411a-9a20-77028dfad0fd","Type":"ContainerDied","Data":"0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9"}
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.389043 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c34a0280-1ca8-411a-9a20-77028dfad0fd","Type":"ContainerDied","Data":"d0f94fa5b49bb7271c458f81dcaaf866f8a18964513fee6ccc9707792cb9d715"}
Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.389159 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.398760 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" event={"ID":"cd25307b-477c-49d2-a59c-b492f873be79","Type":"ContainerDied","Data":"419234ef6de02bd19df46738ba5cd433383268297794d644662d62fe47eb5860"} Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.398796 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="419234ef6de02bd19df46738ba5cd433383268297794d644662d62fe47eb5860" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.398866 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance1dad-account-delete-zpt88" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.438481 4689 scope.go:117] "RemoveContainer" containerID="21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.449125 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.462784 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.469912 4689 scope.go:117] "RemoveContainer" containerID="2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff" Mar 07 04:44:27 crc kubenswrapper[4689]: E0307 04:44:27.470799 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff\": container with ID starting with 2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff not found: ID does not exist" containerID="2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff" Mar 
07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.470862 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff"} err="failed to get container status \"2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff\": rpc error: code = NotFound desc = could not find container \"2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff\": container with ID starting with 2b2e069ea101fdb546218c76f3ca30a8abf3954825e5b8fb274f2b182e821cff not found: ID does not exist" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.470897 4689 scope.go:117] "RemoveContainer" containerID="21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e" Mar 07 04:44:27 crc kubenswrapper[4689]: E0307 04:44:27.471433 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e\": container with ID starting with 21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e not found: ID does not exist" containerID="21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.471517 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e"} err="failed to get container status \"21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e\": rpc error: code = NotFound desc = could not find container \"21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e\": container with ID starting with 21cc461ca19960332a2fde0b4a9d86a1d8152ac2ae671e62a13d4273b981465e not found: ID does not exist" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.471544 4689 scope.go:117] "RemoveContainer" 
containerID="0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.472144 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.480961 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.491942 4689 scope.go:117] "RemoveContainer" containerID="b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.506024 4689 scope.go:117] "RemoveContainer" containerID="0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9" Mar 07 04:44:27 crc kubenswrapper[4689]: E0307 04:44:27.506560 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9\": container with ID starting with 0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9 not found: ID does not exist" containerID="0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.506599 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9"} err="failed to get container status \"0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9\": rpc error: code = NotFound desc = could not find container \"0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9\": container with ID starting with 0abc5f80ec4dc06181173dace8d1651223d7653ae8a6edd0cdab5c5a7e71e0b9 not found: ID does not exist" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.506624 4689 scope.go:117] "RemoveContainer" 
containerID="b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e" Mar 07 04:44:27 crc kubenswrapper[4689]: E0307 04:44:27.506989 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e\": container with ID starting with b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e not found: ID does not exist" containerID="b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.507022 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e"} err="failed to get container status \"b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e\": rpc error: code = NotFound desc = could not find container \"b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e\": container with ID starting with b310c4920a556194d98be33f05e26033cd601c9efe79e096fe4c7a59dc3e903e not found: ID does not exist" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.838488 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a182ec7-71d1-41d1-adf9-6542525e21ae" path="/var/lib/kubelet/pods/4a182ec7-71d1-41d1-adf9-6542525e21ae/volumes" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.839333 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7431fc02-96c1-4a55-aad6-83c23610f7a0" path="/var/lib/kubelet/pods/7431fc02-96c1-4a55-aad6-83c23610f7a0/volumes" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.840114 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34a0280-1ca8-411a-9a20-77028dfad0fd" path="/var/lib/kubelet/pods/c34a0280-1ca8-411a-9a20-77028dfad0fd/volumes" Mar 07 04:44:27 crc kubenswrapper[4689]: I0307 04:44:27.841610 4689 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" path="/var/lib/kubelet/pods/df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5/volumes" Mar 07 04:44:28 crc kubenswrapper[4689]: I0307 04:44:28.361129 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-65kgs"] Mar 07 04:44:28 crc kubenswrapper[4689]: I0307 04:44:28.368467 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-65kgs"] Mar 07 04:44:28 crc kubenswrapper[4689]: I0307 04:44:28.376063 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance1dad-account-delete-zpt88"] Mar 07 04:44:28 crc kubenswrapper[4689]: I0307 04:44:28.384044 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-1dad-account-create-update-b4266"] Mar 07 04:44:28 crc kubenswrapper[4689]: I0307 04:44:28.390476 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance1dad-account-delete-zpt88"] Mar 07 04:44:28 crc kubenswrapper[4689]: I0307 04:44:28.396397 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-1dad-account-create-update-b4266"] Mar 07 04:44:29 crc kubenswrapper[4689]: I0307 04:44:29.189757 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:44:29 crc kubenswrapper[4689]: I0307 04:44:29.189832 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:44:29 crc kubenswrapper[4689]: I0307 
04:44:29.841303 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd" path="/var/lib/kubelet/pods/11f632ac-bf0e-40d0-bcf8-4d5ed1893ccd/volumes" Mar 07 04:44:29 crc kubenswrapper[4689]: I0307 04:44:29.842354 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d73e07-546c-4f2f-8802-ba074301609e" path="/var/lib/kubelet/pods/a2d73e07-546c-4f2f-8802-ba074301609e/volumes" Mar 07 04:44:29 crc kubenswrapper[4689]: I0307 04:44:29.843299 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd25307b-477c-49d2-a59c-b492f873be79" path="/var/lib/kubelet/pods/cd25307b-477c-49d2-a59c-b492f873be79/volumes" Mar 07 04:44:36 crc kubenswrapper[4689]: I0307 04:44:36.078476 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/root-account-create-update-ndcqp"] Mar 07 04:44:36 crc kubenswrapper[4689]: I0307 04:44:36.092102 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/root-account-create-update-ndcqp"] Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.468236 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-ldm2v"] Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.495010 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-ldm2v"] Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.513999 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.516071 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-server" containerID="cri-o://c9115983fb96eb604ca6eee60e5a2764c938cbe715a4b97fa1cffc9f4cfcf61f" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.516449 4689 
kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="swift-recon-cron" containerID="cri-o://af3a3e3771dc5dcb25112f7477a92fb7553646ad838f63f1cf844231472aa223" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.516499 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="rsync" containerID="cri-o://a460a955db79fbf911a926367b117b7f6ceb0c5df6dbccaeddba0833bd8d1785" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.516532 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-expirer" containerID="cri-o://1f89449ad1a80fea4286fef990935e960507f7bf84f0d843d58e0d743c6402d3" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.517476 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-updater" containerID="cri-o://d0fa234be29bc574f8e8ac0e9059a71a0665d50a4a5e8587b656627fe358168d" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.517514 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-auditor" containerID="cri-o://fcc4fd1908f707c3a9b6e85c0ed1a296725aeb2ace62d4136b7cfdf7e4793cb9" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.517544 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-replicator" 
containerID="cri-o://838f0a0e47edc581a3586403409c00fe391e1a0446670e6c3dae72a34453a3a6" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.517594 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-auditor" containerID="cri-o://95c6d4b787a84767360101ff6c8db1dcbf368d75db58bcc4657444d42e1121e2" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.517625 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-reaper" containerID="cri-o://70864ec57f40a96c0cbd682f99e5cc28caf7680eef12aaeebbb5fef77b84ca71" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.517680 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-auditor" containerID="cri-o://faad358fd307a99964689f91a5acb7e967ffb6178743b7f718e092bf976a7e8d" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.517624 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-server" containerID="cri-o://1ce86596f91d66453e465f97afa8624aca1b2b8b2d59d3a5f990349cc84881ae" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.517717 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-updater" containerID="cri-o://df2b64bed9e2330912063f36cf4cceb10965d467dee93369db2730c1e257474e" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.517757 4689 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-replicator" containerID="cri-o://1b3960e36d0b90b01c78ef9cdfc8857c059e03d4fc0b35cebbdbda9d25c2e743" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.517792 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-replicator" containerID="cri-o://4cf74cb6827c9d9ba68e8c8dfa337659418d093ae76fce1056d4f84b43758ab5" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.517844 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-server" containerID="cri-o://413ff247560a52a36969f0cf2f05c5b652d77df05c0d4413b58fcf079e14f38c" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.541650 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc"] Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.541888 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" podUID="6e87b1f1-2509-4c3b-9c4e-c034d697f49b" containerName="proxy-httpd" containerID="cri-o://26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.541996 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" podUID="6e87b1f1-2509-4c3b-9c4e-c034d697f49b" containerName="proxy-server" containerID="cri-o://45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6" gracePeriod=30 Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.836320 4689 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6542a99c-674f-41dc-ae19-f780414bbbf3" path="/var/lib/kubelet/pods/6542a99c-674f-41dc-ae19-f780414bbbf3/volumes" Mar 07 04:44:37 crc kubenswrapper[4689]: I0307 04:44:37.837472 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82016afa-d3dc-4bd8-ae60-db43a0960865" path="/var/lib/kubelet/pods/82016afa-d3dc-4bd8-ae60-db43a0960865/volumes" Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537316 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="a460a955db79fbf911a926367b117b7f6ceb0c5df6dbccaeddba0833bd8d1785" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537667 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="1f89449ad1a80fea4286fef990935e960507f7bf84f0d843d58e0d743c6402d3" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537678 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="d0fa234be29bc574f8e8ac0e9059a71a0665d50a4a5e8587b656627fe358168d" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537685 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="fcc4fd1908f707c3a9b6e85c0ed1a296725aeb2ace62d4136b7cfdf7e4793cb9" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537692 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="838f0a0e47edc581a3586403409c00fe391e1a0446670e6c3dae72a34453a3a6" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537699 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="c9115983fb96eb604ca6eee60e5a2764c938cbe715a4b97fa1cffc9f4cfcf61f" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537706 4689 
generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="df2b64bed9e2330912063f36cf4cceb10965d467dee93369db2730c1e257474e" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537713 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="faad358fd307a99964689f91a5acb7e967ffb6178743b7f718e092bf976a7e8d" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537720 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="1b3960e36d0b90b01c78ef9cdfc8857c059e03d4fc0b35cebbdbda9d25c2e743" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537726 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="1ce86596f91d66453e465f97afa8624aca1b2b8b2d59d3a5f990349cc84881ae" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537378 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"a460a955db79fbf911a926367b117b7f6ceb0c5df6dbccaeddba0833bd8d1785"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537764 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"1f89449ad1a80fea4286fef990935e960507f7bf84f0d843d58e0d743c6402d3"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537733 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="70864ec57f40a96c0cbd682f99e5cc28caf7680eef12aaeebbb5fef77b84ca71" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537818 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" 
containerID="95c6d4b787a84767360101ff6c8db1dcbf368d75db58bcc4657444d42e1121e2" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537849 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="4cf74cb6827c9d9ba68e8c8dfa337659418d093ae76fce1056d4f84b43758ab5" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537777 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"d0fa234be29bc574f8e8ac0e9059a71a0665d50a4a5e8587b656627fe358168d"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537868 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="413ff247560a52a36969f0cf2f05c5b652d77df05c0d4413b58fcf079e14f38c" exitCode=0 Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537882 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"fcc4fd1908f707c3a9b6e85c0ed1a296725aeb2ace62d4136b7cfdf7e4793cb9"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537895 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"838f0a0e47edc581a3586403409c00fe391e1a0446670e6c3dae72a34453a3a6"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537904 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"c9115983fb96eb604ca6eee60e5a2764c938cbe715a4b97fa1cffc9f4cfcf61f"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537912 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"df2b64bed9e2330912063f36cf4cceb10965d467dee93369db2730c1e257474e"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537921 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"faad358fd307a99964689f91a5acb7e967ffb6178743b7f718e092bf976a7e8d"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537929 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"1b3960e36d0b90b01c78ef9cdfc8857c059e03d4fc0b35cebbdbda9d25c2e743"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537939 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"1ce86596f91d66453e465f97afa8624aca1b2b8b2d59d3a5f990349cc84881ae"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537947 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"70864ec57f40a96c0cbd682f99e5cc28caf7680eef12aaeebbb5fef77b84ca71"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537955 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"95c6d4b787a84767360101ff6c8db1dcbf368d75db58bcc4657444d42e1121e2"} Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537964 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"4cf74cb6827c9d9ba68e8c8dfa337659418d093ae76fce1056d4f84b43758ab5"} 
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.537973 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"413ff247560a52a36969f0cf2f05c5b652d77df05c0d4413b58fcf079e14f38c"}
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.539440 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.540066 4689 generic.go:334] "Generic (PLEG): container finished" podID="6e87b1f1-2509-4c3b-9c4e-c034d697f49b" containerID="45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6" exitCode=0
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.540080 4689 generic.go:334] "Generic (PLEG): container finished" podID="6e87b1f1-2509-4c3b-9c4e-c034d697f49b" containerID="26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614" exitCode=0
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.540094 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" event={"ID":"6e87b1f1-2509-4c3b-9c4e-c034d697f49b","Type":"ContainerDied","Data":"45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6"}
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.540108 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" event={"ID":"6e87b1f1-2509-4c3b-9c4e-c034d697f49b","Type":"ContainerDied","Data":"26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614"}
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.540122 4689 scope.go:117] "RemoveContainer" containerID="45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.566636 4689 scope.go:117] "RemoveContainer" containerID="26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.583151 4689 scope.go:117] "RemoveContainer" containerID="45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.583648 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6\": container with ID starting with 45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6 not found: ID does not exist" containerID="45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.583706 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6"} err="failed to get container status \"45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6\": rpc error: code = NotFound desc = could not find container \"45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6\": container with ID starting with 45d6a480cc74ddc5c2d95301b14be917d7482bbf8d5d7f31842963b7376512c6 not found: ID does not exist"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.583738 4689 scope.go:117] "RemoveContainer" containerID="26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.584101 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614\": container with ID starting with 26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614 not found: ID does not exist" containerID="26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.584132 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614"} err="failed to get container status \"26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614\": rpc error: code = NotFound desc = could not find container \"26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614\": container with ID starting with 26178a19b09ae60e94754c2804300118b10f66ccc36dc17a0083b72cba736614 not found: ID does not exist"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.694508 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-config-data\") pod \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") "
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.694560 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-run-httpd\") pod \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") "
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.694592 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-log-httpd\") pod \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") "
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.694637 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift\") pod \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") "
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.694787 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmr4v\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-kube-api-access-jmr4v\") pod \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\" (UID: \"6e87b1f1-2509-4c3b-9c4e-c034d697f49b\") "
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.695178 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e87b1f1-2509-4c3b-9c4e-c034d697f49b" (UID: "6e87b1f1-2509-4c3b-9c4e-c034d697f49b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.695460 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e87b1f1-2509-4c3b-9c4e-c034d697f49b" (UID: "6e87b1f1-2509-4c3b-9c4e-c034d697f49b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.711648 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-kube-api-access-jmr4v" (OuterVolumeSpecName: "kube-api-access-jmr4v") pod "6e87b1f1-2509-4c3b-9c4e-c034d697f49b" (UID: "6e87b1f1-2509-4c3b-9c4e-c034d697f49b"). InnerVolumeSpecName "kube-api-access-jmr4v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.737365 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6e87b1f1-2509-4c3b-9c4e-c034d697f49b" (UID: "6e87b1f1-2509-4c3b-9c4e-c034d697f49b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.778498 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-config-data" (OuterVolumeSpecName: "config-data") pod "6e87b1f1-2509-4c3b-9c4e-c034d697f49b" (UID: "6e87b1f1-2509-4c3b-9c4e-c034d697f49b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.798994 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmr4v\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-kube-api-access-jmr4v\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.799030 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.799041 4689 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.799049 4689 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.799058 4689 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e87b1f1-2509-4c3b-9c4e-c034d697f49b-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.862465 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-vxm85"]
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.869376 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-s5fgw"]
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.870180 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-vxm85"]
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.883599 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-s5fgw"]
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.888935 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-69f7dd67f9-5tpdd"]
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.889377 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" podUID="01e7640b-0391-468f-b8d7-8d0078e52e5f" containerName="keystone-api" containerID="cri-o://87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86" gracePeriod=30
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933010 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystonedf65-account-delete-blxfv"]
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.933356 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e87b1f1-2509-4c3b-9c4e-c034d697f49b" containerName="proxy-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933375 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e87b1f1-2509-4c3b-9c4e-c034d697f49b" containerName="proxy-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.933394 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a182ec7-71d1-41d1-adf9-6542525e21ae" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933403 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a182ec7-71d1-41d1-adf9-6542525e21ae" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.933416 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933424 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.933433 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933440 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.933453 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933460 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.933473 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a182ec7-71d1-41d1-adf9-6542525e21ae" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933479 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a182ec7-71d1-41d1-adf9-6542525e21ae" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.933489 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7431fc02-96c1-4a55-aad6-83c23610f7a0" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933497 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="7431fc02-96c1-4a55-aad6-83c23610f7a0" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.933510 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e87b1f1-2509-4c3b-9c4e-c034d697f49b" containerName="proxy-server"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933516 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e87b1f1-2509-4c3b-9c4e-c034d697f49b" containerName="proxy-server"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.933528 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd25307b-477c-49d2-a59c-b492f873be79" containerName="mariadb-account-delete"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933535 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd25307b-477c-49d2-a59c-b492f873be79" containerName="mariadb-account-delete"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.933549 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933556 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: E0307 04:44:38.933571 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7431fc02-96c1-4a55-aad6-83c23610f7a0" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933578 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="7431fc02-96c1-4a55-aad6-83c23610f7a0" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933732 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933746 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="df1432a0-b2f0-4c75-9fa6-9e3d5b1d16c5" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933757 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="7431fc02-96c1-4a55-aad6-83c23610f7a0" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933767 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933781 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e87b1f1-2509-4c3b-9c4e-c034d697f49b" containerName="proxy-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933792 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a182ec7-71d1-41d1-adf9-6542525e21ae" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933804 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="c34a0280-1ca8-411a-9a20-77028dfad0fd" containerName="glance-log"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933816 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e87b1f1-2509-4c3b-9c4e-c034d697f49b" containerName="proxy-server"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933826 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd25307b-477c-49d2-a59c-b492f873be79" containerName="mariadb-account-delete"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933835 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a182ec7-71d1-41d1-adf9-6542525e21ae" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.933846 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="7431fc02-96c1-4a55-aad6-83c23610f7a0" containerName="glance-httpd"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.934482 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv"
Mar 07 04:44:38 crc kubenswrapper[4689]: I0307 04:44:38.946515 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystonedf65-account-delete-blxfv"]
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.103916 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvp48\" (UniqueName: \"kubernetes.io/projected/71f5f795-049e-4dd3-b436-553b6f16e650-kube-api-access-hvp48\") pod \"keystonedf65-account-delete-blxfv\" (UID: \"71f5f795-049e-4dd3-b436-553b6f16e650\") " pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.103999 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts\") pod \"keystonedf65-account-delete-blxfv\" (UID: \"71f5f795-049e-4dd3-b436-553b6f16e650\") " pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.205683 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvp48\" (UniqueName: \"kubernetes.io/projected/71f5f795-049e-4dd3-b436-553b6f16e650-kube-api-access-hvp48\") pod \"keystonedf65-account-delete-blxfv\" (UID: \"71f5f795-049e-4dd3-b436-553b6f16e650\") " pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.205773 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts\") pod \"keystonedf65-account-delete-blxfv\" (UID: \"71f5f795-049e-4dd3-b436-553b6f16e650\") " pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.207436 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts\") pod \"keystonedf65-account-delete-blxfv\" (UID: \"71f5f795-049e-4dd3-b436-553b6f16e650\") " pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.237343 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvp48\" (UniqueName: \"kubernetes.io/projected/71f5f795-049e-4dd3-b436-553b6f16e650-kube-api-access-hvp48\") pod \"keystonedf65-account-delete-blxfv\" (UID: \"71f5f795-049e-4dd3-b436-553b6f16e650\") " pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.249708 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.549599 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc" event={"ID":"6e87b1f1-2509-4c3b-9c4e-c034d697f49b","Type":"ContainerDied","Data":"d2d58c6aff3ee5415fcd36b566734413bd182221febb561826600f2edd68e397"}
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.549654 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.582728 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc"]
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.589664 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-proxy-7c5699d58c-8dzvc"]
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.730541 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/root-account-create-update-hqn9c"]
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.732993 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-hqn9c"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.751965 4689 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-mariadb-root-db-secret"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.779657 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-hqn9c"]
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.795751 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystonedf65-account-delete-blxfv"]
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.802739 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"]
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.819105 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcfqh\" (UniqueName: \"kubernetes.io/projected/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-kube-api-access-tcfqh\") pod \"root-account-create-update-hqn9c\" (UID: \"6fd5edb3-f4eb-4993-8fd1-52e01de3aece\") " pod="glance-kuttl-tests/root-account-create-update-hqn9c"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.819477 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-operator-scripts\") pod \"root-account-create-update-hqn9c\" (UID: \"6fd5edb3-f4eb-4993-8fd1-52e01de3aece\") " pod="glance-kuttl-tests/root-account-create-update-hqn9c"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.856288 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c1087ab-41b1-4140-87b2-96191dfe1928" path="/var/lib/kubelet/pods/2c1087ab-41b1-4140-87b2-96191dfe1928/volumes"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.858222 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e87b1f1-2509-4c3b-9c4e-c034d697f49b" path="/var/lib/kubelet/pods/6e87b1f1-2509-4c3b-9c4e-c034d697f49b/volumes"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.860154 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1" path="/var/lib/kubelet/pods/f6c1a0bd-8705-4cd3-9ae6-db1e3bf87bd1/volumes"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.861562 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"]
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.861695 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"]
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.880480 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/root-account-create-update-hqn9c"]
Mar 07 04:44:39 crc kubenswrapper[4689]: E0307 04:44:39.881117 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-tcfqh operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="glance-kuttl-tests/root-account-create-update-hqn9c" podUID="6fd5edb3-f4eb-4993-8fd1-52e01de3aece"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.920821 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-operator-scripts\") pod \"root-account-create-update-hqn9c\" (UID: \"6fd5edb3-f4eb-4993-8fd1-52e01de3aece\") " pod="glance-kuttl-tests/root-account-create-update-hqn9c"
Mar 07 04:44:39 crc kubenswrapper[4689]: I0307 04:44:39.920931 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcfqh\" (UniqueName: \"kubernetes.io/projected/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-kube-api-access-tcfqh\") pod \"root-account-create-update-hqn9c\" (UID: \"6fd5edb3-f4eb-4993-8fd1-52e01de3aece\") " pod="glance-kuttl-tests/root-account-create-update-hqn9c"
Mar 07 04:44:39 crc kubenswrapper[4689]: E0307 04:44:39.921361 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Mar 07 04:44:39 crc kubenswrapper[4689]: E0307 04:44:39.922094 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-operator-scripts podName:6fd5edb3-f4eb-4993-8fd1-52e01de3aece nodeName:}" failed. No retries permitted until 2026-03-07 04:44:40.422077537 +0000 UTC m=+1525.468461016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-operator-scripts") pod "root-account-create-update-hqn9c" (UID: "6fd5edb3-f4eb-4993-8fd1-52e01de3aece") : configmap "openstack-scripts" not found
Mar 07 04:44:39 crc kubenswrapper[4689]: E0307 04:44:39.927912 4689 projected.go:194] Error preparing data for projected volume kube-api-access-tcfqh for pod glance-kuttl-tests/root-account-create-update-hqn9c: failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 07 04:44:39 crc kubenswrapper[4689]: E0307 04:44:39.927962 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-kube-api-access-tcfqh podName:6fd5edb3-f4eb-4993-8fd1-52e01de3aece nodeName:}" failed. No retries permitted until 2026-03-07 04:44:40.427948815 +0000 UTC m=+1525.474332294 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tcfqh" (UniqueName: "kubernetes.io/projected/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-kube-api-access-tcfqh") pod "root-account-create-update-hqn9c" (UID: "6fd5edb3-f4eb-4993-8fd1-52e01de3aece") : failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.003545 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-2" podUID="243ddc02-c377-44ac-9b47-2240c3d9efed" containerName="galera" containerID="cri-o://78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de" gracePeriod=30
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.426303 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/memcached-0"]
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.426537 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/memcached-0" podUID="c41b2833-be4f-46a8-b1fb-7c244ac8530b" containerName="memcached" containerID="cri-o://53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96" gracePeriod=30
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.428291 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-operator-scripts\") pod \"root-account-create-update-hqn9c\" (UID: \"6fd5edb3-f4eb-4993-8fd1-52e01de3aece\") " pod="glance-kuttl-tests/root-account-create-update-hqn9c"
Mar 07 04:44:40 crc kubenswrapper[4689]: E0307 04:44:40.428483 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Mar 07 04:44:40 crc kubenswrapper[4689]: E0307 04:44:40.428597 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-operator-scripts podName:6fd5edb3-f4eb-4993-8fd1-52e01de3aece nodeName:}" failed. No retries permitted until 2026-03-07 04:44:41.428567478 +0000 UTC m=+1526.474951017 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-operator-scripts") pod "root-account-create-update-hqn9c" (UID: "6fd5edb3-f4eb-4993-8fd1-52e01de3aece") : configmap "openstack-scripts" not found
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.428707 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcfqh\" (UniqueName: \"kubernetes.io/projected/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-kube-api-access-tcfqh\") pod \"root-account-create-update-hqn9c\" (UID: \"6fd5edb3-f4eb-4993-8fd1-52e01de3aece\") " pod="glance-kuttl-tests/root-account-create-update-hqn9c"
Mar 07 04:44:40 crc kubenswrapper[4689]: E0307 04:44:40.433157 4689 projected.go:194] Error preparing data for projected volume kube-api-access-tcfqh for pod glance-kuttl-tests/root-account-create-update-hqn9c: failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 07 04:44:40 crc kubenswrapper[4689]: E0307 04:44:40.433848 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-kube-api-access-tcfqh podName:6fd5edb3-f4eb-4993-8fd1-52e01de3aece nodeName:}" failed. No retries permitted until 2026-03-07 04:44:41.433807958 +0000 UTC m=+1526.480191487 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tcfqh" (UniqueName: "kubernetes.io/projected/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-kube-api-access-tcfqh") pod "root-account-create-update-hqn9c" (UID: "6fd5edb3-f4eb-4993-8fd1-52e01de3aece") : failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.561365 4689 generic.go:334] "Generic (PLEG): container finished" podID="71f5f795-049e-4dd3-b436-553b6f16e650" containerID="55b527ea27cdab9e0f7b0138466eb3de0e5051a3e0d0f303729644b17f20390e" exitCode=1
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.561447 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-hqn9c"
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.561708 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv" event={"ID":"71f5f795-049e-4dd3-b436-553b6f16e650","Type":"ContainerDied","Data":"55b527ea27cdab9e0f7b0138466eb3de0e5051a3e0d0f303729644b17f20390e"}
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.561736 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv" event={"ID":"71f5f795-049e-4dd3-b436-553b6f16e650","Type":"ContainerStarted","Data":"0da80b5b428593d5db4d3884861c2660274a6410e52cf196218d53133d569689"}
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.562129 4689 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv" secret="" err="secret \"galera-openstack-dockercfg-4lnkm\" not found"
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.562193 4689 scope.go:117] "RemoveContainer" containerID="55b527ea27cdab9e0f7b0138466eb3de0e5051a3e0d0f303729644b17f20390e"
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.598448 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-hqn9c"
Mar 07 04:44:40 crc kubenswrapper[4689]: E0307 04:44:40.732474 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Mar 07 04:44:40 crc kubenswrapper[4689]: E0307 04:44:40.732574 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts podName:71f5f795-049e-4dd3-b436-553b6f16e650 nodeName:}" failed. No retries permitted until 2026-03-07 04:44:41.232548772 +0000 UTC m=+1526.278932311 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts") pod "keystonedf65-account-delete-blxfv" (UID: "71f5f795-049e-4dd3-b436-553b6f16e650") : configmap "openstack-scripts" not found
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.939245 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2"
Mar 07 04:44:40 crc kubenswrapper[4689]: I0307 04:44:40.982788 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"]
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.036664 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"243ddc02-c377-44ac-9b47-2240c3d9efed\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") "
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.036755 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-operator-scripts\") pod \"243ddc02-c377-44ac-9b47-2240c3d9efed\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") "
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.036791 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-default\") pod \"243ddc02-c377-44ac-9b47-2240c3d9efed\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") "
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.036814 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-kolla-config\") pod \"243ddc02-c377-44ac-9b47-2240c3d9efed\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") "
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.036879 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqsp8\" (UniqueName: \"kubernetes.io/projected/243ddc02-c377-44ac-9b47-2240c3d9efed-kube-api-access-nqsp8\") pod \"243ddc02-c377-44ac-9b47-2240c3d9efed\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") "
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.036956 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-generated\") pod \"243ddc02-c377-44ac-9b47-2240c3d9efed\" (UID: \"243ddc02-c377-44ac-9b47-2240c3d9efed\") "
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.037300 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "243ddc02-c377-44ac-9b47-2240c3d9efed" (UID: "243ddc02-c377-44ac-9b47-2240c3d9efed"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.037398 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "243ddc02-c377-44ac-9b47-2240c3d9efed" (UID: "243ddc02-c377-44ac-9b47-2240c3d9efed"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.037432 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "243ddc02-c377-44ac-9b47-2240c3d9efed" (UID: "243ddc02-c377-44ac-9b47-2240c3d9efed"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.037528 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "243ddc02-c377-44ac-9b47-2240c3d9efed" (UID: "243ddc02-c377-44ac-9b47-2240c3d9efed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.046154 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "mysql-db") pod "243ddc02-c377-44ac-9b47-2240c3d9efed" (UID: "243ddc02-c377-44ac-9b47-2240c3d9efed"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.048983 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/243ddc02-c377-44ac-9b47-2240c3d9efed-kube-api-access-nqsp8" (OuterVolumeSpecName: "kube-api-access-nqsp8") pod "243ddc02-c377-44ac-9b47-2240c3d9efed" (UID: "243ddc02-c377-44ac-9b47-2240c3d9efed"). InnerVolumeSpecName "kube-api-access-nqsp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.138520 4689 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.138562 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.138594 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.138605 4689 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.138616 4689 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/243ddc02-c377-44ac-9b47-2240c3d9efed-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.138624 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqsp8\" (UniqueName: \"kubernetes.io/projected/243ddc02-c377-44ac-9b47-2240c3d9efed-kube-api-access-nqsp8\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.150956 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Mar 07 04:44:41 crc kubenswrapper[4689]: E0307 04:44:41.240797 4689 
configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.240869 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:41 crc kubenswrapper[4689]: E0307 04:44:41.241035 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts podName:71f5f795-049e-4dd3-b436-553b6f16e650 nodeName:}" failed. No retries permitted until 2026-03-07 04:44:42.241008275 +0000 UTC m=+1527.287391804 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts") pod "keystonedf65-account-delete-blxfv" (UID: "71f5f795-049e-4dd3-b436-553b6f16e650") : configmap "openstack-scripts" not found Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.406735 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.443596 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-operator-scripts\") pod \"root-account-create-update-hqn9c\" (UID: \"6fd5edb3-f4eb-4993-8fd1-52e01de3aece\") " pod="glance-kuttl-tests/root-account-create-update-hqn9c" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.443682 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcfqh\" (UniqueName: \"kubernetes.io/projected/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-kube-api-access-tcfqh\") pod \"root-account-create-update-hqn9c\" (UID: \"6fd5edb3-f4eb-4993-8fd1-52e01de3aece\") " 
pod="glance-kuttl-tests/root-account-create-update-hqn9c" Mar 07 04:44:41 crc kubenswrapper[4689]: E0307 04:44:41.443778 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 07 04:44:41 crc kubenswrapper[4689]: E0307 04:44:41.443855 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-operator-scripts podName:6fd5edb3-f4eb-4993-8fd1-52e01de3aece nodeName:}" failed. No retries permitted until 2026-03-07 04:44:43.44383709 +0000 UTC m=+1528.490220579 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-operator-scripts") pod "root-account-create-update-hqn9c" (UID: "6fd5edb3-f4eb-4993-8fd1-52e01de3aece") : configmap "openstack-scripts" not found Mar 07 04:44:41 crc kubenswrapper[4689]: E0307 04:44:41.446341 4689 projected.go:194] Error preparing data for projected volume kube-api-access-tcfqh for pod glance-kuttl-tests/root-account-create-update-hqn9c: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 07 04:44:41 crc kubenswrapper[4689]: E0307 04:44:41.446410 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-kube-api-access-tcfqh podName:6fd5edb3-f4eb-4993-8fd1-52e01de3aece nodeName:}" failed. No retries permitted until 2026-03-07 04:44:43.446385828 +0000 UTC m=+1528.492769317 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tcfqh" (UniqueName: "kubernetes.io/projected/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-kube-api-access-tcfqh") pod "root-account-create-update-hqn9c" (UID: "6fd5edb3-f4eb-4993-8fd1-52e01de3aece") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.570284 4689 generic.go:334] "Generic (PLEG): container finished" podID="71f5f795-049e-4dd3-b436-553b6f16e650" containerID="aab256129d20879de068e6dc30461887f7d5802f010964b5481f8b1b2cec85c2" exitCode=1 Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.570333 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv" event={"ID":"71f5f795-049e-4dd3-b436-553b6f16e650","Type":"ContainerDied","Data":"aab256129d20879de068e6dc30461887f7d5802f010964b5481f8b1b2cec85c2"} Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.570390 4689 scope.go:117] "RemoveContainer" containerID="55b527ea27cdab9e0f7b0138466eb3de0e5051a3e0d0f303729644b17f20390e" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.570762 4689 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv" secret="" err="secret \"galera-openstack-dockercfg-4lnkm\" not found" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.570814 4689 scope.go:117] "RemoveContainer" containerID="aab256129d20879de068e6dc30461887f7d5802f010964b5481f8b1b2cec85c2" Mar 07 04:44:41 crc kubenswrapper[4689]: E0307 04:44:41.571111 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystonedf65-account-delete-blxfv_glance-kuttl-tests(71f5f795-049e-4dd3-b436-553b6f16e650)\"" pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv" podUID="71f5f795-049e-4dd3-b436-553b6f16e650" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.573147 4689 generic.go:334] "Generic (PLEG): container finished" podID="243ddc02-c377-44ac-9b47-2240c3d9efed" containerID="78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de" exitCode=0 Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.573217 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-hqn9c" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.573212 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"243ddc02-c377-44ac-9b47-2240c3d9efed","Type":"ContainerDied","Data":"78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de"} Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.573262 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"243ddc02-c377-44ac-9b47-2240c3d9efed","Type":"ContainerDied","Data":"7947c3d03fcf0132f31811227f3583400e6eea68bf29f2a0ee53df51b711243f"} Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.573325 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.608998 4689 scope.go:117] "RemoveContainer" containerID="78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.632285 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/root-account-create-update-hqn9c"] Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.632696 4689 scope.go:117] "RemoveContainer" containerID="dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.645425 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/root-account-create-update-hqn9c"] Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.649304 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/rabbitmq-server-0" podUID="b8758a96-64ae-4c03-b392-5aa8c68cc641" containerName="rabbitmq" containerID="cri-o://fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23" gracePeriod=604800 Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.669516 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.677130 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.678194 4689 scope.go:117] "RemoveContainer" containerID="78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de" Mar 07 04:44:41 crc kubenswrapper[4689]: E0307 04:44:41.678611 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de\": container with ID starting with 78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de 
not found: ID does not exist" containerID="78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.678659 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de"} err="failed to get container status \"78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de\": rpc error: code = NotFound desc = could not find container \"78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de\": container with ID starting with 78e373bdc492e60f5647201a4fff2ab4ee9ef76eca9ca1346e987752b754a3de not found: ID does not exist" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.678693 4689 scope.go:117] "RemoveContainer" containerID="dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc" Mar 07 04:44:41 crc kubenswrapper[4689]: E0307 04:44:41.679352 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc\": container with ID starting with dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc not found: ID does not exist" containerID="dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.679380 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc"} err="failed to get container status \"dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc\": rpc error: code = NotFound desc = could not find container \"dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc\": container with ID starting with dc21aff73e3d3b1f30474aa95eb0c7b2ae6dc404bcf830c806fc5312775819fc not found: ID does not exist" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 
04:44:41.747389 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.747442 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcfqh\" (UniqueName: \"kubernetes.io/projected/6fd5edb3-f4eb-4993-8fd1-52e01de3aece-kube-api-access-tcfqh\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.834225 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="243ddc02-c377-44ac-9b47-2240c3d9efed" path="/var/lib/kubelet/pods/243ddc02-c377-44ac-9b47-2240c3d9efed/volumes" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.834734 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd5edb3-f4eb-4993-8fd1-52e01de3aece" path="/var/lib/kubelet/pods/6fd5edb3-f4eb-4993-8fd1-52e01de3aece/volumes" Mar 07 04:44:41 crc kubenswrapper[4689]: I0307 04:44:41.894085 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/memcached-0" podUID="c41b2833-be4f-46a8-b1fb-7c244ac8530b" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.56:11211: connect: connection refused" Mar 07 04:44:42 crc kubenswrapper[4689]: I0307 04:44:42.042384 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-1" podUID="26e0bab4-0913-4193-bb07-8d1802eda6c0" containerName="galera" containerID="cri-o://a15b111de1ac2c83ca80e44c5c4ff7f0530be43f7923b13afab7da027b126d2f" gracePeriod=28 Mar 07 04:44:42 crc kubenswrapper[4689]: E0307 04:44:42.259156 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 07 04:44:42 crc kubenswrapper[4689]: E0307 04:44:42.259273 4689 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts podName:71f5f795-049e-4dd3-b436-553b6f16e650 nodeName:}" failed. No retries permitted until 2026-03-07 04:44:44.259254308 +0000 UTC m=+1529.305637797 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts") pod "keystonedf65-account-delete-blxfv" (UID: "71f5f795-049e-4dd3-b436-553b6f16e650") : configmap "openstack-scripts" not found Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.288144 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.347249 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.359933 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-config-data\") pod \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.360052 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kolla-config\") pod \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.360077 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spvnz\" (UniqueName: \"kubernetes.io/projected/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kube-api-access-spvnz\") pod \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\" (UID: \"c41b2833-be4f-46a8-b1fb-7c244ac8530b\") " Mar 
07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.363752 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-config-data" (OuterVolumeSpecName: "config-data") pod "c41b2833-be4f-46a8-b1fb-7c244ac8530b" (UID: "c41b2833-be4f-46a8-b1fb-7c244ac8530b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.364001 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "c41b2833-be4f-46a8-b1fb-7c244ac8530b" (UID: "c41b2833-be4f-46a8-b1fb-7c244ac8530b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.376469 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kube-api-access-spvnz" (OuterVolumeSpecName: "kube-api-access-spvnz") pod "c41b2833-be4f-46a8-b1fb-7c244ac8530b" (UID: "c41b2833-be4f-46a8-b1fb-7c244ac8530b"). InnerVolumeSpecName "kube-api-access-spvnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.402611 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn"] Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.402801 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" podUID="c5573c74-db15-40d3-9e5a-fa66061ec3bb" containerName="manager" containerID="cri-o://0e3de09508b036f7aff417affb92d290088f4676c282117a5adf1d8b786704a1" gracePeriod=10 Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.461253 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-scripts\") pod \"01e7640b-0391-468f-b8d7-8d0078e52e5f\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.461308 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwrsm\" (UniqueName: \"kubernetes.io/projected/01e7640b-0391-468f-b8d7-8d0078e52e5f-kube-api-access-qwrsm\") pod \"01e7640b-0391-468f-b8d7-8d0078e52e5f\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.461415 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-credential-keys\") pod \"01e7640b-0391-468f-b8d7-8d0078e52e5f\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.461438 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-config-data\") pod \"01e7640b-0391-468f-b8d7-8d0078e52e5f\" (UID: 
\"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.461456 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-fernet-keys\") pod \"01e7640b-0391-468f-b8d7-8d0078e52e5f\" (UID: \"01e7640b-0391-468f-b8d7-8d0078e52e5f\") " Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.461733 4689 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.461745 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spvnz\" (UniqueName: \"kubernetes.io/projected/c41b2833-be4f-46a8-b1fb-7c244ac8530b-kube-api-access-spvnz\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.461755 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c41b2833-be4f-46a8-b1fb-7c244ac8530b-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.465603 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "01e7640b-0391-468f-b8d7-8d0078e52e5f" (UID: "01e7640b-0391-468f-b8d7-8d0078e52e5f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.467864 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-scripts" (OuterVolumeSpecName: "scripts") pod "01e7640b-0391-468f-b8d7-8d0078e52e5f" (UID: "01e7640b-0391-468f-b8d7-8d0078e52e5f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.469374 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e7640b-0391-468f-b8d7-8d0078e52e5f-kube-api-access-qwrsm" (OuterVolumeSpecName: "kube-api-access-qwrsm") pod "01e7640b-0391-468f-b8d7-8d0078e52e5f" (UID: "01e7640b-0391-468f-b8d7-8d0078e52e5f"). InnerVolumeSpecName "kube-api-access-qwrsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.479321 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "01e7640b-0391-468f-b8d7-8d0078e52e5f" (UID: "01e7640b-0391-468f-b8d7-8d0078e52e5f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.504072 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-config-data" (OuterVolumeSpecName: "config-data") pod "01e7640b-0391-468f-b8d7-8d0078e52e5f" (UID: "01e7640b-0391-468f-b8d7-8d0078e52e5f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.563458 4689 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.563484 4689 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.563492 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwrsm\" (UniqueName: \"kubernetes.io/projected/01e7640b-0391-468f-b8d7-8d0078e52e5f-kube-api-access-qwrsm\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.563501 4689 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.563509 4689 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e7640b-0391-468f-b8d7-8d0078e52e5f-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.592891 4689 generic.go:334] "Generic (PLEG): container finished" podID="c5573c74-db15-40d3-9e5a-fa66061ec3bb" containerID="0e3de09508b036f7aff417affb92d290088f4676c282117a5adf1d8b786704a1" exitCode=0 Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.592958 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" event={"ID":"c5573c74-db15-40d3-9e5a-fa66061ec3bb","Type":"ContainerDied","Data":"0e3de09508b036f7aff417affb92d290088f4676c282117a5adf1d8b786704a1"} Mar 07 04:44:43 crc 
kubenswrapper[4689]: I0307 04:44:42.597044 4689 generic.go:334] "Generic (PLEG): container finished" podID="01e7640b-0391-468f-b8d7-8d0078e52e5f" containerID="87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86" exitCode=0 Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.597119 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" event={"ID":"01e7640b-0391-468f-b8d7-8d0078e52e5f","Type":"ContainerDied","Data":"87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86"} Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.597145 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd" event={"ID":"01e7640b-0391-468f-b8d7-8d0078e52e5f","Type":"ContainerDied","Data":"892cbbc83d0b091f4ac373a3a284acaa066b14d3f1dda0b69050cfd1ed6a34e5"} Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.597191 4689 scope.go:117] "RemoveContainer" containerID="87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86" Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.597354 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-69f7dd67f9-5tpdd"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.600460 4689 generic.go:334] "Generic (PLEG): container finished" podID="c41b2833-be4f-46a8-b1fb-7c244ac8530b" containerID="53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96" exitCode=0
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.600511 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"c41b2833-be4f-46a8-b1fb-7c244ac8530b","Type":"ContainerDied","Data":"53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96"}
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.600526 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"c41b2833-be4f-46a8-b1fb-7c244ac8530b","Type":"ContainerDied","Data":"a4b54915837ec35d4b36960643c7576d6ea2b7df400722e9058394c5c654e041"}
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.600528 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.608278 4689 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv" secret="" err="secret \"galera-openstack-dockercfg-4lnkm\" not found"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.608329 4689 scope.go:117] "RemoveContainer" containerID="aab256129d20879de068e6dc30461887f7d5802f010964b5481f8b1b2cec85c2"
Mar 07 04:44:43 crc kubenswrapper[4689]: E0307 04:44:42.609366 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystonedf65-account-delete-blxfv_glance-kuttl-tests(71f5f795-049e-4dd3-b436-553b6f16e650)\"" pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv" podUID="71f5f795-049e-4dd3-b436-553b6f16e650"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.649315 4689 scope.go:117] "RemoveContainer" containerID="87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86"
Mar 07 04:44:43 crc kubenswrapper[4689]: E0307 04:44:42.662331 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86\": container with ID starting with 87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86 not found: ID does not exist" containerID="87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.662382 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86"} err="failed to get container status \"87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86\": rpc error: code = NotFound desc = could not find container \"87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86\": container with ID starting with 87d0d751d50f0db58d73fa2ecaf6a679823c0cf76dc5f0c3a185b6dcf31a4b86 not found: ID does not exist"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.662406 4689 scope.go:117] "RemoveContainer" containerID="53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.732562 4689 scope.go:117] "RemoveContainer" containerID="53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.733286 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-vr96j"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.733631 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-index-vr96j" podUID="bd8b0d1d-32da-409d-9453-bef0c8ca65f1" containerName="registry-server" containerID="cri-o://69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5" gracePeriod=30
Mar 07 04:44:43 crc kubenswrapper[4689]: E0307 04:44:42.740323 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96\": container with ID starting with 53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96 not found: ID does not exist" containerID="53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.740364 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96"} err="failed to get container status \"53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96\": rpc error: code = NotFound desc = could not find container \"53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96\": container with ID starting with 53c37414043853fb7bcc0360890ee54661bab211ef9a3f8c8ea54488f67d8f96 not found: ID does not exist"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.769467 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/memcached-0"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.794227 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/memcached-0"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.816329 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-69f7dd67f9-5tpdd"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.841369 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-69f7dd67f9-5tpdd"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.846987 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:42.865986 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/ebd64b31bae918b3ecf6bf74ee0b9df7b931253e0ecb9c915c5b7f7fdclllmr"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.285754 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.383299 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwt8t\" (UniqueName: \"kubernetes.io/projected/c5573c74-db15-40d3-9e5a-fa66061ec3bb-kube-api-access-wwt8t\") pod \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.383370 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-apiservice-cert\") pod \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.383430 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-webhook-cert\") pod \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\" (UID: \"c5573c74-db15-40d3-9e5a-fa66061ec3bb\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.387539 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "c5573c74-db15-40d3-9e5a-fa66061ec3bb" (UID: "c5573c74-db15-40d3-9e5a-fa66061ec3bb"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.387608 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "c5573c74-db15-40d3-9e5a-fa66061ec3bb" (UID: "c5573c74-db15-40d3-9e5a-fa66061ec3bb"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.387744 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5573c74-db15-40d3-9e5a-fa66061ec3bb-kube-api-access-wwt8t" (OuterVolumeSpecName: "kube-api-access-wwt8t") pod "c5573c74-db15-40d3-9e5a-fa66061ec3bb" (UID: "c5573c74-db15-40d3-9e5a-fa66061ec3bb"). InnerVolumeSpecName "kube-api-access-wwt8t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.390343 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.394915 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-vr96j"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.484451 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xgkw\" (UniqueName: \"kubernetes.io/projected/bd8b0d1d-32da-409d-9453-bef0c8ca65f1-kube-api-access-6xgkw\") pod \"bd8b0d1d-32da-409d-9453-bef0c8ca65f1\" (UID: \"bd8b0d1d-32da-409d-9453-bef0c8ca65f1\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.484496 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt8nj\" (UniqueName: \"kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-kube-api-access-zt8nj\") pod \"b8758a96-64ae-4c03-b392-5aa8c68cc641\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.484519 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-plugins\") pod \"b8758a96-64ae-4c03-b392-5aa8c68cc641\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.484539 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8758a96-64ae-4c03-b392-5aa8c68cc641-pod-info\") pod \"b8758a96-64ae-4c03-b392-5aa8c68cc641\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.484612 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8758a96-64ae-4c03-b392-5aa8c68cc641-erlang-cookie-secret\") pod \"b8758a96-64ae-4c03-b392-5aa8c68cc641\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.484674 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-confd\") pod \"b8758a96-64ae-4c03-b392-5aa8c68cc641\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.484791 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\") pod \"b8758a96-64ae-4c03-b392-5aa8c68cc641\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.484838 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-erlang-cookie\") pod \"b8758a96-64ae-4c03-b392-5aa8c68cc641\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.484865 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8758a96-64ae-4c03-b392-5aa8c68cc641-plugins-conf\") pod \"b8758a96-64ae-4c03-b392-5aa8c68cc641\" (UID: \"b8758a96-64ae-4c03-b392-5aa8c68cc641\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.485021 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b8758a96-64ae-4c03-b392-5aa8c68cc641" (UID: "b8758a96-64ae-4c03-b392-5aa8c68cc641"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.485125 4689 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.485137 4689 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-apiservice-cert\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.485145 4689 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5573c74-db15-40d3-9e5a-fa66061ec3bb-webhook-cert\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.485155 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwt8t\" (UniqueName: \"kubernetes.io/projected/c5573c74-db15-40d3-9e5a-fa66061ec3bb-kube-api-access-wwt8t\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.485497 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8758a96-64ae-4c03-b392-5aa8c68cc641-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b8758a96-64ae-4c03-b392-5aa8c68cc641" (UID: "b8758a96-64ae-4c03-b392-5aa8c68cc641"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.485864 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b8758a96-64ae-4c03-b392-5aa8c68cc641" (UID: "b8758a96-64ae-4c03-b392-5aa8c68cc641"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.487674 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8758a96-64ae-4c03-b392-5aa8c68cc641-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b8758a96-64ae-4c03-b392-5aa8c68cc641" (UID: "b8758a96-64ae-4c03-b392-5aa8c68cc641"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.487806 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b8758a96-64ae-4c03-b392-5aa8c68cc641-pod-info" (OuterVolumeSpecName: "pod-info") pod "b8758a96-64ae-4c03-b392-5aa8c68cc641" (UID: "b8758a96-64ae-4c03-b392-5aa8c68cc641"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.489575 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd8b0d1d-32da-409d-9453-bef0c8ca65f1-kube-api-access-6xgkw" (OuterVolumeSpecName: "kube-api-access-6xgkw") pod "bd8b0d1d-32da-409d-9453-bef0c8ca65f1" (UID: "bd8b0d1d-32da-409d-9453-bef0c8ca65f1"). InnerVolumeSpecName "kube-api-access-6xgkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.492141 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-kube-api-access-zt8nj" (OuterVolumeSpecName: "kube-api-access-zt8nj") pod "b8758a96-64ae-4c03-b392-5aa8c68cc641" (UID: "b8758a96-64ae-4c03-b392-5aa8c68cc641"). InnerVolumeSpecName "kube-api-access-zt8nj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.504639 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152" (OuterVolumeSpecName: "persistence") pod "b8758a96-64ae-4c03-b392-5aa8c68cc641" (UID: "b8758a96-64ae-4c03-b392-5aa8c68cc641"). InnerVolumeSpecName "pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.560593 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b8758a96-64ae-4c03-b392-5aa8c68cc641" (UID: "b8758a96-64ae-4c03-b392-5aa8c68cc641"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.586349 4689 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.586612 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\") on node \"crc\" "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.586689 4689 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8758a96-64ae-4c03-b392-5aa8c68cc641-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.586774 4689 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8758a96-64ae-4c03-b392-5aa8c68cc641-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.586832 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xgkw\" (UniqueName: \"kubernetes.io/projected/bd8b0d1d-32da-409d-9453-bef0c8ca65f1-kube-api-access-6xgkw\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.586889 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt8nj\" (UniqueName: \"kubernetes.io/projected/b8758a96-64ae-4c03-b392-5aa8c68cc641-kube-api-access-zt8nj\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.586956 4689 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8758a96-64ae-4c03-b392-5aa8c68cc641-pod-info\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.587022 4689 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8758a96-64ae-4c03-b392-5aa8c68cc641-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.614163 4689 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.614478 4689 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152") on node "crc"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.622282 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.622285 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn" event={"ID":"c5573c74-db15-40d3-9e5a-fa66061ec3bb","Type":"ContainerDied","Data":"4abee1e9515cf4aa783dd0bd5e8bb2632b616a4cb74614e22db148428e48f3aa"}
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.622341 4689 scope.go:117] "RemoveContainer" containerID="0e3de09508b036f7aff417affb92d290088f4676c282117a5adf1d8b786704a1"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.625161 4689 generic.go:334] "Generic (PLEG): container finished" podID="b8758a96-64ae-4c03-b392-5aa8c68cc641" containerID="fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23" exitCode=0
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.625258 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"b8758a96-64ae-4c03-b392-5aa8c68cc641","Type":"ContainerDied","Data":"fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23"}
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.625295 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"b8758a96-64ae-4c03-b392-5aa8c68cc641","Type":"ContainerDied","Data":"91dca4fbe16cbc8f22bd82b5b3d4386c8f4e61b3f5c4be41296b365b7863fa0e"}
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.625395 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.626464 4689 generic.go:334] "Generic (PLEG): container finished" podID="bd8b0d1d-32da-409d-9453-bef0c8ca65f1" containerID="69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5" exitCode=0
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.626574 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-vr96j" event={"ID":"bd8b0d1d-32da-409d-9453-bef0c8ca65f1","Type":"ContainerDied","Data":"69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5"}
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.626602 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-vr96j" event={"ID":"bd8b0d1d-32da-409d-9453-bef0c8ca65f1","Type":"ContainerDied","Data":"1b5523618aff16ba392613154f43436eb9653012922a39cfa69da84f1170d8c9"}
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.626603 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-vr96j"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.630639 4689 generic.go:334] "Generic (PLEG): container finished" podID="26e0bab4-0913-4193-bb07-8d1802eda6c0" containerID="a15b111de1ac2c83ca80e44c5c4ff7f0530be43f7923b13afab7da027b126d2f" exitCode=0
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.630678 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"26e0bab4-0913-4193-bb07-8d1802eda6c0","Type":"ContainerDied","Data":"a15b111de1ac2c83ca80e44c5c4ff7f0530be43f7923b13afab7da027b126d2f"}
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.678265 4689 scope.go:117] "RemoveContainer" containerID="fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.688470 4689 reconciler_common.go:293] "Volume detached for volume \"pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b09f6a1-5c25-4617-8823-7fd7b58fc152\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.727890 4689 scope.go:117] "RemoveContainer" containerID="df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.728994 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-vr96j"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.743813 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-index-vr96j"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.752352 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.755832 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.760195 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.764386 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5bd67dfbcc-6grpn"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.765373 4689 scope.go:117] "RemoveContainer" containerID="fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23"
Mar 07 04:44:43 crc kubenswrapper[4689]: E0307 04:44:43.765677 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23\": container with ID starting with fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23 not found: ID does not exist" containerID="fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.765702 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23"} err="failed to get container status \"fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23\": rpc error: code = NotFound desc = could not find container \"fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23\": container with ID starting with fd843436ab0a4ae9a82f7cb532803c971a279f08f92c79c2de39d2fe8a972d23 not found: ID does not exist"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.765719 4689 scope.go:117] "RemoveContainer" containerID="df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e"
Mar 07 04:44:43 crc kubenswrapper[4689]: E0307 04:44:43.765985 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e\": container with ID starting with df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e not found: ID does not exist" containerID="df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.766021 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e"} err="failed to get container status \"df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e\": rpc error: code = NotFound desc = could not find container \"df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e\": container with ID starting with df73ae97f7f931ce51d921b931364eadb91b0bc93313e06219d09747cc840f0e not found: ID does not exist"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.766049 4689 scope.go:117] "RemoveContainer" containerID="69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.788334 4689 scope.go:117] "RemoveContainer" containerID="69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5"
Mar 07 04:44:43 crc kubenswrapper[4689]: E0307 04:44:43.788645 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5\": container with ID starting with 69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5 not found: ID does not exist" containerID="69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.788671 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5"} err="failed to get container status \"69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5\": rpc error: code = NotFound desc = could not find container \"69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5\": container with ID starting with 69625fc3ec765217be7ece782ad3b2487af8df450575a9872f76821b6e9a50f5 not found: ID does not exist"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.835051 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.838267 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e7640b-0391-468f-b8d7-8d0078e52e5f" path="/var/lib/kubelet/pods/01e7640b-0391-468f-b8d7-8d0078e52e5f/volumes"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.839271 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8758a96-64ae-4c03-b392-5aa8c68cc641" path="/var/lib/kubelet/pods/b8758a96-64ae-4c03-b392-5aa8c68cc641/volumes"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.840003 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd8b0d1d-32da-409d-9453-bef0c8ca65f1" path="/var/lib/kubelet/pods/bd8b0d1d-32da-409d-9453-bef0c8ca65f1/volumes"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.841400 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41b2833-be4f-46a8-b1fb-7c244ac8530b" path="/var/lib/kubelet/pods/c41b2833-be4f-46a8-b1fb-7c244ac8530b/volumes"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.842107 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5573c74-db15-40d3-9e5a-fa66061ec3bb" path="/var/lib/kubelet/pods/c5573c74-db15-40d3-9e5a-fa66061ec3bb/volumes"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.850980 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f02cb0ce-c569-4668-bc73-142e3340935f" path="/var/lib/kubelet/pods/f02cb0ce-c569-4668-bc73-142e3340935f/volumes"
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.941248 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-create-v6n8c"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.945997 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-create-v6n8c"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.960127 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-df65-account-create-update-tpgsg"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.965186 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystonedf65-account-delete-blxfv"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.968773 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-df65-account-create-update-tpgsg"]
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.991935 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-default\") pod \"26e0bab4-0913-4193-bb07-8d1802eda6c0\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.992383 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-operator-scripts\") pod \"26e0bab4-0913-4193-bb07-8d1802eda6c0\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.992504 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-generated\") pod \"26e0bab4-0913-4193-bb07-8d1802eda6c0\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.992595 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"26e0bab4-0913-4193-bb07-8d1802eda6c0\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.992666 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "26e0bab4-0913-4193-bb07-8d1802eda6c0" (UID: "26e0bab4-0913-4193-bb07-8d1802eda6c0"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.992784 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-kolla-config\") pod \"26e0bab4-0913-4193-bb07-8d1802eda6c0\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.992925 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9w2p\" (UniqueName: \"kubernetes.io/projected/26e0bab4-0913-4193-bb07-8d1802eda6c0-kube-api-access-w9w2p\") pod \"26e0bab4-0913-4193-bb07-8d1802eda6c0\" (UID: \"26e0bab4-0913-4193-bb07-8d1802eda6c0\") "
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.993126 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26e0bab4-0913-4193-bb07-8d1802eda6c0" (UID: "26e0bab4-0913-4193-bb07-8d1802eda6c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.993423 4689 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-default\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.993554 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.993651 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "26e0bab4-0913-4193-bb07-8d1802eda6c0" (UID: "26e0bab4-0913-4193-bb07-8d1802eda6c0"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.993845 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "26e0bab4-0913-4193-bb07-8d1802eda6c0" (UID: "26e0bab4-0913-4193-bb07-8d1802eda6c0"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 04:44:43 crc kubenswrapper[4689]: I0307 04:44:43.999336 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e0bab4-0913-4193-bb07-8d1802eda6c0-kube-api-access-w9w2p" (OuterVolumeSpecName: "kube-api-access-w9w2p") pod "26e0bab4-0913-4193-bb07-8d1802eda6c0" (UID: "26e0bab4-0913-4193-bb07-8d1802eda6c0"). InnerVolumeSpecName "kube-api-access-w9w2p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.020646 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "26e0bab4-0913-4193-bb07-8d1802eda6c0" (UID: "26e0bab4-0913-4193-bb07-8d1802eda6c0"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.066346 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-0" podUID="af13923a-66fb-409e-a32e-42b1837151fe" containerName="galera" containerID="cri-o://aa6e9e5b81519a2444b9b02197944b8195cde24c07f3c66e98b50d857210adf7" gracePeriod=26
Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.095031 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9w2p\" (UniqueName: \"kubernetes.io/projected/26e0bab4-0913-4193-bb07-8d1802eda6c0-kube-api-access-w9w2p\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.095064 4689 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26e0bab4-0913-4193-bb07-8d1802eda6c0-config-data-generated\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.095090 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.095104 4689 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26e0bab4-0913-4193-bb07-8d1802eda6c0-kolla-config\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.110586 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.196876 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.284855 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv"
Mar 07 04:44:44 crc kubenswrapper[4689]: E0307 04:44:44.298731 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Mar 07 04:44:44 crc kubenswrapper[4689]: E0307 04:44:44.298799 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts podName:71f5f795-049e-4dd3-b436-553b6f16e650 nodeName:}" failed. No retries permitted until 2026-03-07 04:44:48.298784737 +0000 UTC m=+1533.345168226 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts") pod "keystonedf65-account-delete-blxfv" (UID: "71f5f795-049e-4dd3-b436-553b6f16e650") : configmap "openstack-scripts" not found Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.399617 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvp48\" (UniqueName: \"kubernetes.io/projected/71f5f795-049e-4dd3-b436-553b6f16e650-kube-api-access-hvp48\") pod \"71f5f795-049e-4dd3-b436-553b6f16e650\" (UID: \"71f5f795-049e-4dd3-b436-553b6f16e650\") " Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.399728 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts\") pod \"71f5f795-049e-4dd3-b436-553b6f16e650\" (UID: \"71f5f795-049e-4dd3-b436-553b6f16e650\") " Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.400781 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71f5f795-049e-4dd3-b436-553b6f16e650" (UID: "71f5f795-049e-4dd3-b436-553b6f16e650"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.406359 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71f5f795-049e-4dd3-b436-553b6f16e650-kube-api-access-hvp48" (OuterVolumeSpecName: "kube-api-access-hvp48") pod "71f5f795-049e-4dd3-b436-553b6f16e650" (UID: "71f5f795-049e-4dd3-b436-553b6f16e650"). InnerVolumeSpecName "kube-api-access-hvp48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.501229 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvp48\" (UniqueName: \"kubernetes.io/projected/71f5f795-049e-4dd3-b436-553b6f16e650-kube-api-access-hvp48\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.501274 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f5f795-049e-4dd3-b436-553b6f16e650-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.647608 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"26e0bab4-0913-4193-bb07-8d1802eda6c0","Type":"ContainerDied","Data":"48ed228fa3b2684e6f8d7ecf1f94f3cf99c2a0bfd42d5d5e5aadc92f1298682c"} Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.647682 4689 scope.go:117] "RemoveContainer" containerID="a15b111de1ac2c83ca80e44c5c4ff7f0530be43f7923b13afab7da027b126d2f" Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.647943 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.654029 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv" Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.654063 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystonedf65-account-delete-blxfv" event={"ID":"71f5f795-049e-4dd3-b436-553b6f16e650","Type":"ContainerDied","Data":"0da80b5b428593d5db4d3884861c2660274a6410e52cf196218d53133d569689"} Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.658857 4689 generic.go:334] "Generic (PLEG): container finished" podID="af13923a-66fb-409e-a32e-42b1837151fe" containerID="aa6e9e5b81519a2444b9b02197944b8195cde24c07f3c66e98b50d857210adf7" exitCode=0 Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.658926 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"af13923a-66fb-409e-a32e-42b1837151fe","Type":"ContainerDied","Data":"aa6e9e5b81519a2444b9b02197944b8195cde24c07f3c66e98b50d857210adf7"} Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.692399 4689 scope.go:117] "RemoveContainer" containerID="815152161e0a7dded51afa61b167ea2c378366e12438620ee58b3bfcceae4ed6" Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.696712 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.705973 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.722233 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystonedf65-account-delete-blxfv"] Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.727089 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystonedf65-account-delete-blxfv"] Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.766756 4689 scope.go:117] "RemoveContainer" 
containerID="aab256129d20879de068e6dc30461887f7d5802f010964b5481f8b1b2cec85c2" Mar 07 04:44:44 crc kubenswrapper[4689]: I0307 04:44:44.851690 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.006978 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/af13923a-66fb-409e-a32e-42b1837151fe-config-data-generated\") pod \"af13923a-66fb-409e-a32e-42b1837151fe\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.007085 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-operator-scripts\") pod \"af13923a-66fb-409e-a32e-42b1837151fe\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.007116 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbx2f\" (UniqueName: \"kubernetes.io/projected/af13923a-66fb-409e-a32e-42b1837151fe-kube-api-access-gbx2f\") pod \"af13923a-66fb-409e-a32e-42b1837151fe\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.007200 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-kolla-config\") pod \"af13923a-66fb-409e-a32e-42b1837151fe\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.007237 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"af13923a-66fb-409e-a32e-42b1837151fe\" (UID: 
\"af13923a-66fb-409e-a32e-42b1837151fe\") " Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.007273 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-config-data-default\") pod \"af13923a-66fb-409e-a32e-42b1837151fe\" (UID: \"af13923a-66fb-409e-a32e-42b1837151fe\") " Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.007625 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af13923a-66fb-409e-a32e-42b1837151fe-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "af13923a-66fb-409e-a32e-42b1837151fe" (UID: "af13923a-66fb-409e-a32e-42b1837151fe"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.007962 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "af13923a-66fb-409e-a32e-42b1837151fe" (UID: "af13923a-66fb-409e-a32e-42b1837151fe"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.007995 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af13923a-66fb-409e-a32e-42b1837151fe" (UID: "af13923a-66fb-409e-a32e-42b1837151fe"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.008156 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "af13923a-66fb-409e-a32e-42b1837151fe" (UID: "af13923a-66fb-409e-a32e-42b1837151fe"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.010815 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af13923a-66fb-409e-a32e-42b1837151fe-kube-api-access-gbx2f" (OuterVolumeSpecName: "kube-api-access-gbx2f") pod "af13923a-66fb-409e-a32e-42b1837151fe" (UID: "af13923a-66fb-409e-a32e-42b1837151fe"). InnerVolumeSpecName "kube-api-access-gbx2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.015955 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "af13923a-66fb-409e-a32e-42b1837151fe" (UID: "af13923a-66fb-409e-a32e-42b1837151fe"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.108482 4689 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.108527 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbx2f\" (UniqueName: \"kubernetes.io/projected/af13923a-66fb-409e-a32e-42b1837151fe-kube-api-access-gbx2f\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.108538 4689 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.108575 4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.108586 4689 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/af13923a-66fb-409e-a32e-42b1837151fe-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.108594 4689 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/af13923a-66fb-409e-a32e-42b1837151fe-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.119850 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.210692 4689 
reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.679872 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"af13923a-66fb-409e-a32e-42b1837151fe","Type":"ContainerDied","Data":"1c184d30f5f37ca58e2bf0d51c5ce53307e7afbcc6f63afedaf1fde55a8c6acf"} Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.679979 4689 scope.go:117] "RemoveContainer" containerID="aa6e9e5b81519a2444b9b02197944b8195cde24c07f3c66e98b50d857210adf7" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.679997 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.708463 4689 scope.go:117] "RemoveContainer" containerID="9964af22e9987fe53552277762e5029819ae340630713a219d90389162debb35" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.737511 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.754712 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.838103 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e0bab4-0913-4193-bb07-8d1802eda6c0" path="/var/lib/kubelet/pods/26e0bab4-0913-4193-bb07-8d1802eda6c0/volumes" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.839062 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71f5f795-049e-4dd3-b436-553b6f16e650" path="/var/lib/kubelet/pods/71f5f795-049e-4dd3-b436-553b6f16e650/volumes" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.840052 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="af13923a-66fb-409e-a32e-42b1837151fe" path="/var/lib/kubelet/pods/af13923a-66fb-409e-a32e-42b1837151fe/volumes" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.842307 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c0e0af-98ed-4f4b-a406-d883afe0395b" path="/var/lib/kubelet/pods/c3c0e0af-98ed-4f4b-a406-d883afe0395b/volumes" Mar 07 04:44:45 crc kubenswrapper[4689]: I0307 04:44:45.843664 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06beb76-5cac-4ccf-9478-9dcb7ba03aee" path="/var/lib/kubelet/pods/d06beb76-5cac-4ccf-9478-9dcb7ba03aee/volumes" Mar 07 04:44:47 crc kubenswrapper[4689]: I0307 04:44:47.431591 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4"] Mar 07 04:44:47 crc kubenswrapper[4689]: I0307 04:44:47.432140 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" podUID="096a01ec-b76b-4553-aa1b-91b0282c3470" containerName="manager" containerID="cri-o://96c0d27aae27223b99c7a936596e5ce31f081a158d76a7845e6c32ffb2829466" gracePeriod=10 Mar 07 04:44:47 crc kubenswrapper[4689]: I0307 04:44:47.693031 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-5pqgx"] Mar 07 04:44:47 crc kubenswrapper[4689]: I0307 04:44:47.693282 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-index-5pqgx" podUID="7eb6e990-66c3-471d-b9b7-8a82f5652638" containerName="registry-server" containerID="cri-o://b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6" gracePeriod=30 Mar 07 04:44:47 crc kubenswrapper[4689]: I0307 04:44:47.706005 4689 generic.go:334] "Generic (PLEG): container finished" podID="096a01ec-b76b-4553-aa1b-91b0282c3470" containerID="96c0d27aae27223b99c7a936596e5ce31f081a158d76a7845e6c32ffb2829466" 
exitCode=0 Mar 07 04:44:47 crc kubenswrapper[4689]: I0307 04:44:47.706053 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" event={"ID":"096a01ec-b76b-4553-aa1b-91b0282c3470","Type":"ContainerDied","Data":"96c0d27aae27223b99c7a936596e5ce31f081a158d76a7845e6c32ffb2829466"} Mar 07 04:44:47 crc kubenswrapper[4689]: I0307 04:44:47.757235 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft"] Mar 07 04:44:47 crc kubenswrapper[4689]: I0307 04:44:47.763200 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9f79a44f21654cfe31d24b641097daf8fb6883ab71742c1395b1c2969ec4fft"] Mar 07 04:44:47 crc kubenswrapper[4689]: I0307 04:44:47.835022 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a1d244-2c54-45fa-af37-993bb70ec9ed" path="/var/lib/kubelet/pods/69a1d244-2c54-45fa-af37-993bb70ec9ed/volumes" Mar 07 04:44:47 crc kubenswrapper[4689]: I0307 04:44:47.920728 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.055042 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-apiservice-cert\") pod \"096a01ec-b76b-4553-aa1b-91b0282c3470\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.055475 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-webhook-cert\") pod \"096a01ec-b76b-4553-aa1b-91b0282c3470\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.055505 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpt2n\" (UniqueName: \"kubernetes.io/projected/096a01ec-b76b-4553-aa1b-91b0282c3470-kube-api-access-fpt2n\") pod \"096a01ec-b76b-4553-aa1b-91b0282c3470\" (UID: \"096a01ec-b76b-4553-aa1b-91b0282c3470\") " Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.060723 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/096a01ec-b76b-4553-aa1b-91b0282c3470-kube-api-access-fpt2n" (OuterVolumeSpecName: "kube-api-access-fpt2n") pod "096a01ec-b76b-4553-aa1b-91b0282c3470" (UID: "096a01ec-b76b-4553-aa1b-91b0282c3470"). InnerVolumeSpecName "kube-api-access-fpt2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.061797 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "096a01ec-b76b-4553-aa1b-91b0282c3470" (UID: "096a01ec-b76b-4553-aa1b-91b0282c3470"). 
InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.088523 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "096a01ec-b76b-4553-aa1b-91b0282c3470" (UID: "096a01ec-b76b-4553-aa1b-91b0282c3470"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.116259 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-5pqgx" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.156800 4689 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.156852 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpt2n\" (UniqueName: \"kubernetes.io/projected/096a01ec-b76b-4553-aa1b-91b0282c3470-kube-api-access-fpt2n\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.156862 4689 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/096a01ec-b76b-4553-aa1b-91b0282c3470-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.257704 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjpzf\" (UniqueName: \"kubernetes.io/projected/7eb6e990-66c3-471d-b9b7-8a82f5652638-kube-api-access-xjpzf\") pod \"7eb6e990-66c3-471d-b9b7-8a82f5652638\" (UID: \"7eb6e990-66c3-471d-b9b7-8a82f5652638\") " Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.262073 4689 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb6e990-66c3-471d-b9b7-8a82f5652638-kube-api-access-xjpzf" (OuterVolumeSpecName: "kube-api-access-xjpzf") pod "7eb6e990-66c3-471d-b9b7-8a82f5652638" (UID: "7eb6e990-66c3-471d-b9b7-8a82f5652638"). InnerVolumeSpecName "kube-api-access-xjpzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.359665 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjpzf\" (UniqueName: \"kubernetes.io/projected/7eb6e990-66c3-471d-b9b7-8a82f5652638-kube-api-access-xjpzf\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.717418 4689 generic.go:334] "Generic (PLEG): container finished" podID="7eb6e990-66c3-471d-b9b7-8a82f5652638" containerID="b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6" exitCode=0 Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.717533 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-5pqgx" event={"ID":"7eb6e990-66c3-471d-b9b7-8a82f5652638","Type":"ContainerDied","Data":"b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6"} Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.717591 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-5pqgx" event={"ID":"7eb6e990-66c3-471d-b9b7-8a82f5652638","Type":"ContainerDied","Data":"f8476293da548218427748dd425ef2ab582da898c20856eb903f762a7b2613b4"} Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.717506 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-5pqgx" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.717618 4689 scope.go:117] "RemoveContainer" containerID="b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.725446 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" event={"ID":"096a01ec-b76b-4553-aa1b-91b0282c3470","Type":"ContainerDied","Data":"088a69d6c690f0e2371b123c9c6dd232095ac0343cfb407fbd14f4cb940cb2bf"} Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.725545 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.753316 4689 scope.go:117] "RemoveContainer" containerID="b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6" Mar 07 04:44:48 crc kubenswrapper[4689]: E0307 04:44:48.753967 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6\": container with ID starting with b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6 not found: ID does not exist" containerID="b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.754062 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6"} err="failed to get container status \"b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6\": rpc error: code = NotFound desc = could not find container \"b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6\": container with ID starting with 
b33b076f84eea17313178fabf2909522c0119d3a2808a1315c51bd5af23c00f6 not found: ID does not exist" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.754103 4689 scope.go:117] "RemoveContainer" containerID="96c0d27aae27223b99c7a936596e5ce31f081a158d76a7845e6c32ffb2829466" Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.775826 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-5pqgx"] Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.787290 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-index-5pqgx"] Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.796192 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4"] Mar 07 04:44:48 crc kubenswrapper[4689]: I0307 04:44:48.801237 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-controller-manager-744fbfddcd-gk4h4"] Mar 07 04:44:49 crc kubenswrapper[4689]: I0307 04:44:49.862668 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="096a01ec-b76b-4553-aa1b-91b0282c3470" path="/var/lib/kubelet/pods/096a01ec-b76b-4553-aa1b-91b0282c3470/volumes" Mar 07 04:44:49 crc kubenswrapper[4689]: I0307 04:44:49.864053 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb6e990-66c3-471d-b9b7-8a82f5652638" path="/var/lib/kubelet/pods/7eb6e990-66c3-471d-b9b7-8a82f5652638/volumes" Mar 07 04:44:51 crc kubenswrapper[4689]: I0307 04:44:51.863362 4689 scope.go:117] "RemoveContainer" containerID="0ca506acd66aa2400e68ab25a7f08f12b730eb5e8e327076583942821ce53fb6" Mar 07 04:44:51 crc kubenswrapper[4689]: I0307 04:44:51.913827 4689 scope.go:117] "RemoveContainer" containerID="8f93665d9f72c0f3b18b3ed33085c1482efa697b8936866087d3e21cda14670b" Mar 07 04:44:51 crc kubenswrapper[4689]: I0307 04:44:51.969687 4689 scope.go:117] "RemoveContainer" 
containerID="a93bf6538071e1db6db606ad71151582e0647e5a78cac50834314780bfbd763c" Mar 07 04:44:51 crc kubenswrapper[4689]: I0307 04:44:51.990597 4689 scope.go:117] "RemoveContainer" containerID="ad307d5a4ec8c0a23c00d2d81ea49346a33388014013494e5a8093558af45270" Mar 07 04:44:52 crc kubenswrapper[4689]: I0307 04:44:52.026503 4689 scope.go:117] "RemoveContainer" containerID="1da4067e5102555aac456769b1b0ba6f41f94a68c76beb712b50860008fa6a25" Mar 07 04:44:52 crc kubenswrapper[4689]: I0307 04:44:52.046409 4689 scope.go:117] "RemoveContainer" containerID="beedd200283e3ddcf5f321d56d2129a80e698420f1bfbff4e065fa4516a71d84" Mar 07 04:44:52 crc kubenswrapper[4689]: I0307 04:44:52.064333 4689 scope.go:117] "RemoveContainer" containerID="4ec70c5fbe89650f3d3a689ac9370df57fc44f31c6efc015c2f7f2d1d4648fbf" Mar 07 04:44:52 crc kubenswrapper[4689]: I0307 04:44:52.093970 4689 scope.go:117] "RemoveContainer" containerID="6d84c35f61707c7b38c0f7abfad7b43658ceb65dc79a2d9b8c4aaeb55210b97f" Mar 07 04:44:52 crc kubenswrapper[4689]: I0307 04:44:52.108923 4689 scope.go:117] "RemoveContainer" containerID="3375165fd607e9989ebac042353510dc2d585b80cbb826cc8af79ab15a85781e" Mar 07 04:44:52 crc kubenswrapper[4689]: I0307 04:44:52.128251 4689 scope.go:117] "RemoveContainer" containerID="448ee8c5bed33702fd3fa184542ce564afd4bf7c755e19d80750618f6cec259e" Mar 07 04:44:52 crc kubenswrapper[4689]: I0307 04:44:52.146045 4689 scope.go:117] "RemoveContainer" containerID="1553062b35e6fce92d9ceca1f370c71e00fe127c25f80f1a4887b6403e3a193c" Mar 07 04:44:52 crc kubenswrapper[4689]: I0307 04:44:52.159103 4689 scope.go:117] "RemoveContainer" containerID="acab0757368f55b51224f719dbdc35614a84224ef9dbaec25cc43618dc55b963" Mar 07 04:44:52 crc kubenswrapper[4689]: I0307 04:44:52.174652 4689 scope.go:117] "RemoveContainer" containerID="88b647131cc1a5f8296c7683f24cdf754aa521e804e9978b5aa0d333118b0b93" Mar 07 04:44:52 crc kubenswrapper[4689]: I0307 04:44:52.188982 4689 scope.go:117] "RemoveContainer" 
containerID="91a644c2b79c56d56175895a1ae32eaf0c816e2a1de75c928b161c4dd0001e47" Mar 07 04:44:53 crc kubenswrapper[4689]: I0307 04:44:53.590767 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v"] Mar 07 04:44:53 crc kubenswrapper[4689]: I0307 04:44:53.590970 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" podUID="2fbf4774-52d5-49ff-8066-d6363f88c3c5" containerName="manager" containerID="cri-o://43dc6c8c5209facffc79f545ef744c2506053398b1e14289fb2ddb4ed03525e5" gracePeriod=10 Mar 07 04:44:53 crc kubenswrapper[4689]: I0307 04:44:53.784365 4689 generic.go:334] "Generic (PLEG): container finished" podID="2fbf4774-52d5-49ff-8066-d6363f88c3c5" containerID="43dc6c8c5209facffc79f545ef744c2506053398b1e14289fb2ddb4ed03525e5" exitCode=0 Mar 07 04:44:53 crc kubenswrapper[4689]: I0307 04:44:53.784446 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" event={"ID":"2fbf4774-52d5-49ff-8066-d6363f88c3c5","Type":"ContainerDied","Data":"43dc6c8c5209facffc79f545ef744c2506053398b1e14289fb2ddb4ed03525e5"} Mar 07 04:44:53 crc kubenswrapper[4689]: I0307 04:44:53.846991 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-dtt2x"] Mar 07 04:44:53 crc kubenswrapper[4689]: I0307 04:44:53.847241 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-dtt2x" podUID="c0bfd96e-646a-4e38-bcd8-c77623fea007" containerName="registry-server" containerID="cri-o://a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1" gracePeriod=30 Mar 07 04:44:53 crc kubenswrapper[4689]: I0307 04:44:53.877558 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9"] Mar 07 04:44:53 crc kubenswrapper[4689]: I0307 04:44:53.883783 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vkfm9"] Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.097034 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.255651 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-apiservice-cert\") pod \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.255772 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-webhook-cert\") pod \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.255898 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w859q\" (UniqueName: \"kubernetes.io/projected/2fbf4774-52d5-49ff-8066-d6363f88c3c5-kube-api-access-w859q\") pod \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\" (UID: \"2fbf4774-52d5-49ff-8066-d6363f88c3c5\") " Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.262571 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbf4774-52d5-49ff-8066-d6363f88c3c5-kube-api-access-w859q" (OuterVolumeSpecName: "kube-api-access-w859q") pod "2fbf4774-52d5-49ff-8066-d6363f88c3c5" (UID: "2fbf4774-52d5-49ff-8066-d6363f88c3c5"). 
InnerVolumeSpecName "kube-api-access-w859q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.263111 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-dtt2x" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.264817 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "2fbf4774-52d5-49ff-8066-d6363f88c3c5" (UID: "2fbf4774-52d5-49ff-8066-d6363f88c3c5"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.264782 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "2fbf4774-52d5-49ff-8066-d6363f88c3c5" (UID: "2fbf4774-52d5-49ff-8066-d6363f88c3c5"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.357457 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g55n7\" (UniqueName: \"kubernetes.io/projected/c0bfd96e-646a-4e38-bcd8-c77623fea007-kube-api-access-g55n7\") pod \"c0bfd96e-646a-4e38-bcd8-c77623fea007\" (UID: \"c0bfd96e-646a-4e38-bcd8-c77623fea007\") " Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.357776 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w859q\" (UniqueName: \"kubernetes.io/projected/2fbf4774-52d5-49ff-8066-d6363f88c3c5-kube-api-access-w859q\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.357794 4689 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.357802 4689 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2fbf4774-52d5-49ff-8066-d6363f88c3c5-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.360516 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0bfd96e-646a-4e38-bcd8-c77623fea007-kube-api-access-g55n7" (OuterVolumeSpecName: "kube-api-access-g55n7") pod "c0bfd96e-646a-4e38-bcd8-c77623fea007" (UID: "c0bfd96e-646a-4e38-bcd8-c77623fea007"). InnerVolumeSpecName "kube-api-access-g55n7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.459469 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g55n7\" (UniqueName: \"kubernetes.io/projected/c0bfd96e-646a-4e38-bcd8-c77623fea007-kube-api-access-g55n7\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.794894 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.794925 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v" event={"ID":"2fbf4774-52d5-49ff-8066-d6363f88c3c5","Type":"ContainerDied","Data":"ccdc84d8b04dc030ad4ec723fbe5fc92629b27c5512bd0e70665c7bb8816d240"} Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.795375 4689 scope.go:117] "RemoveContainer" containerID="43dc6c8c5209facffc79f545ef744c2506053398b1e14289fb2ddb4ed03525e5" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.798290 4689 generic.go:334] "Generic (PLEG): container finished" podID="c0bfd96e-646a-4e38-bcd8-c77623fea007" containerID="a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1" exitCode=0 Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.798339 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-dtt2x" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.798345 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dtt2x" event={"ID":"c0bfd96e-646a-4e38-bcd8-c77623fea007","Type":"ContainerDied","Data":"a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1"} Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.798384 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dtt2x" event={"ID":"c0bfd96e-646a-4e38-bcd8-c77623fea007","Type":"ContainerDied","Data":"8ea83a2121fad5178e4c5403b2378339092453436159ee1d229ad3c08b253d22"} Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.824464 4689 scope.go:117] "RemoveContainer" containerID="a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.845917 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-dtt2x"] Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.857620 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-dtt2x"] Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.858580 4689 scope.go:117] "RemoveContainer" containerID="a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1" Mar 07 04:44:54 crc kubenswrapper[4689]: E0307 04:44:54.859254 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1\": container with ID starting with a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1 not found: ID does not exist" containerID="a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.859301 4689 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1"} err="failed to get container status \"a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1\": rpc error: code = NotFound desc = could not find container \"a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1\": container with ID starting with a3206cdd9a5f218bc202bb8b4acce1ccca2884ca082edffb1cb8ce8cca0ef7f1 not found: ID does not exist" Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.866065 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v"] Mar 07 04:44:54 crc kubenswrapper[4689]: I0307 04:44:54.871184 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c6bb6574-hcn5v"] Mar 07 04:44:55 crc kubenswrapper[4689]: I0307 04:44:55.655882 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx"] Mar 07 04:44:55 crc kubenswrapper[4689]: I0307 04:44:55.656103 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" podUID="eaa03e1b-1007-4b01-9753-7c0ffa27b09c" containerName="operator" containerID="cri-o://cfdab96e6f02eeef3b9343183918f4011d67b29e8ce0838ff558df8dae482a4b" gracePeriod=10 Mar 07 04:44:55 crc kubenswrapper[4689]: I0307 04:44:55.831100 4689 generic.go:334] "Generic (PLEG): container finished" podID="eaa03e1b-1007-4b01-9753-7c0ffa27b09c" containerID="cfdab96e6f02eeef3b9343183918f4011d67b29e8ce0838ff558df8dae482a4b" exitCode=0 Mar 07 04:44:55 crc kubenswrapper[4689]: I0307 04:44:55.842472 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fbf4774-52d5-49ff-8066-d6363f88c3c5" path="/var/lib/kubelet/pods/2fbf4774-52d5-49ff-8066-d6363f88c3c5/volumes" Mar 07 04:44:55 crc 
kubenswrapper[4689]: I0307 04:44:55.843115 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="789a8575-312c-4ce6-96e5-ccc4c7f8373f" path="/var/lib/kubelet/pods/789a8575-312c-4ce6-96e5-ccc4c7f8373f/volumes" Mar 07 04:44:55 crc kubenswrapper[4689]: I0307 04:44:55.843721 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0bfd96e-646a-4e38-bcd8-c77623fea007" path="/var/lib/kubelet/pods/c0bfd96e-646a-4e38-bcd8-c77623fea007/volumes" Mar 07 04:44:55 crc kubenswrapper[4689]: I0307 04:44:55.844108 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" event={"ID":"eaa03e1b-1007-4b01-9753-7c0ffa27b09c","Type":"ContainerDied","Data":"cfdab96e6f02eeef3b9343183918f4011d67b29e8ce0838ff558df8dae482a4b"} Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.014582 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-mtqqp"] Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.014757 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" podUID="05c84021-6716-4d39-ab89-1cea45f77a64" containerName="registry-server" containerID="cri-o://8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247" gracePeriod=30 Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.040223 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9"] Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.044594 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590phgd9"] Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.122956 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.282708 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tnqb\" (UniqueName: \"kubernetes.io/projected/eaa03e1b-1007-4b01-9753-7c0ffa27b09c-kube-api-access-7tnqb\") pod \"eaa03e1b-1007-4b01-9753-7c0ffa27b09c\" (UID: \"eaa03e1b-1007-4b01-9753-7c0ffa27b09c\") " Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.321740 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa03e1b-1007-4b01-9753-7c0ffa27b09c-kube-api-access-7tnqb" (OuterVolumeSpecName: "kube-api-access-7tnqb") pod "eaa03e1b-1007-4b01-9753-7c0ffa27b09c" (UID: "eaa03e1b-1007-4b01-9753-7c0ffa27b09c"). InnerVolumeSpecName "kube-api-access-7tnqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.385085 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tnqb\" (UniqueName: \"kubernetes.io/projected/eaa03e1b-1007-4b01-9753-7c0ffa27b09c-kube-api-access-7tnqb\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.500595 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.587988 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgp79\" (UniqueName: \"kubernetes.io/projected/05c84021-6716-4d39-ab89-1cea45f77a64-kube-api-access-fgp79\") pod \"05c84021-6716-4d39-ab89-1cea45f77a64\" (UID: \"05c84021-6716-4d39-ab89-1cea45f77a64\") " Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.591271 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05c84021-6716-4d39-ab89-1cea45f77a64-kube-api-access-fgp79" (OuterVolumeSpecName: "kube-api-access-fgp79") pod "05c84021-6716-4d39-ab89-1cea45f77a64" (UID: "05c84021-6716-4d39-ab89-1cea45f77a64"). InnerVolumeSpecName "kube-api-access-fgp79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.689746 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgp79\" (UniqueName: \"kubernetes.io/projected/05c84021-6716-4d39-ab89-1cea45f77a64-kube-api-access-fgp79\") on node \"crc\" DevicePath \"\"" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.849363 4689 generic.go:334] "Generic (PLEG): container finished" podID="05c84021-6716-4d39-ab89-1cea45f77a64" containerID="8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247" exitCode=0 Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.849436 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" event={"ID":"05c84021-6716-4d39-ab89-1cea45f77a64","Type":"ContainerDied","Data":"8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247"} Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.849436 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.849708 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-mtqqp" event={"ID":"05c84021-6716-4d39-ab89-1cea45f77a64","Type":"ContainerDied","Data":"3accee3638adb30f1fa6296babcd1d47f10edc16066a2e2584670f0a7faa0332"} Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.849747 4689 scope.go:117] "RemoveContainer" containerID="8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.851046 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" event={"ID":"eaa03e1b-1007-4b01-9753-7c0ffa27b09c","Type":"ContainerDied","Data":"4e8c8445e8a576e4ef56a357206f21f553a14c8f1b7a7b3a48c01309f05bef87"} Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.851087 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.866636 4689 scope.go:117] "RemoveContainer" containerID="8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247" Mar 07 04:44:56 crc kubenswrapper[4689]: E0307 04:44:56.867369 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247\": container with ID starting with 8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247 not found: ID does not exist" containerID="8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.867414 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247"} err="failed to get container status \"8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247\": rpc error: code = NotFound desc = could not find container \"8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247\": container with ID starting with 8d96b90eefc5dea6b731a20bcb0187f602099cd8fd82f95944a33ee7539f4247 not found: ID does not exist" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.867439 4689 scope.go:117] "RemoveContainer" containerID="cfdab96e6f02eeef3b9343183918f4011d67b29e8ce0838ff558df8dae482a4b" Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.892395 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx"] Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.898427 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-9mvxx"] Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.902571 4689 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-mtqqp"] Mar 07 04:44:56 crc kubenswrapper[4689]: I0307 04:44:56.908520 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-mtqqp"] Mar 07 04:44:57 crc kubenswrapper[4689]: I0307 04:44:57.858159 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05c84021-6716-4d39-ab89-1cea45f77a64" path="/var/lib/kubelet/pods/05c84021-6716-4d39-ab89-1cea45f77a64/volumes" Mar 07 04:44:57 crc kubenswrapper[4689]: I0307 04:44:57.859391 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1de71b3a-96bf-445c-b321-5f0d10b77523" path="/var/lib/kubelet/pods/1de71b3a-96bf-445c-b321-5f0d10b77523/volumes" Mar 07 04:44:57 crc kubenswrapper[4689]: I0307 04:44:57.860676 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaa03e1b-1007-4b01-9753-7c0ffa27b09c" path="/var/lib/kubelet/pods/eaa03e1b-1007-4b01-9753-7c0ffa27b09c/volumes" Mar 07 04:44:59 crc kubenswrapper[4689]: I0307 04:44:59.189766 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:44:59 crc kubenswrapper[4689]: I0307 04:44:59.190292 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144304 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn"] Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 
04:45:00.144580 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af13923a-66fb-409e-a32e-42b1837151fe" containerName="galera" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144594 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="af13923a-66fb-409e-a32e-42b1837151fe" containerName="galera" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144607 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa03e1b-1007-4b01-9753-7c0ffa27b09c" containerName="operator" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144614 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa03e1b-1007-4b01-9753-7c0ffa27b09c" containerName="operator" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144626 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbf4774-52d5-49ff-8066-d6363f88c3c5" containerName="manager" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144636 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbf4774-52d5-49ff-8066-d6363f88c3c5" containerName="manager" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144646 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="096a01ec-b76b-4553-aa1b-91b0282c3470" containerName="manager" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144653 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="096a01ec-b76b-4553-aa1b-91b0282c3470" containerName="manager" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144665 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e0bab4-0913-4193-bb07-8d1802eda6c0" containerName="mysql-bootstrap" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144673 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e0bab4-0913-4193-bb07-8d1802eda6c0" containerName="mysql-bootstrap" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144686 4689 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7eb6e990-66c3-471d-b9b7-8a82f5652638" containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144693 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb6e990-66c3-471d-b9b7-8a82f5652638" containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144702 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243ddc02-c377-44ac-9b47-2240c3d9efed" containerName="mysql-bootstrap" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144709 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="243ddc02-c377-44ac-9b47-2240c3d9efed" containerName="mysql-bootstrap" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144722 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e7640b-0391-468f-b8d7-8d0078e52e5f" containerName="keystone-api" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144730 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e7640b-0391-468f-b8d7-8d0078e52e5f" containerName="keystone-api" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144745 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd8b0d1d-32da-409d-9453-bef0c8ca65f1" containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144752 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd8b0d1d-32da-409d-9453-bef0c8ca65f1" containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144764 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af13923a-66fb-409e-a32e-42b1837151fe" containerName="mysql-bootstrap" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144771 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="af13923a-66fb-409e-a32e-42b1837151fe" containerName="mysql-bootstrap" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144783 4689 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b8758a96-64ae-4c03-b392-5aa8c68cc641" containerName="setup-container" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144791 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8758a96-64ae-4c03-b392-5aa8c68cc641" containerName="setup-container" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144802 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8758a96-64ae-4c03-b392-5aa8c68cc641" containerName="rabbitmq" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144809 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8758a96-64ae-4c03-b392-5aa8c68cc641" containerName="rabbitmq" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144819 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41b2833-be4f-46a8-b1fb-7c244ac8530b" containerName="memcached" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144827 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41b2833-be4f-46a8-b1fb-7c244ac8530b" containerName="memcached" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144838 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0bfd96e-646a-4e38-bcd8-c77623fea007" containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144846 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0bfd96e-646a-4e38-bcd8-c77623fea007" containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144858 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71f5f795-049e-4dd3-b436-553b6f16e650" containerName="mariadb-account-delete" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144866 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f5f795-049e-4dd3-b436-553b6f16e650" containerName="mariadb-account-delete" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144876 4689 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="71f5f795-049e-4dd3-b436-553b6f16e650" containerName="mariadb-account-delete" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144884 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f5f795-049e-4dd3-b436-553b6f16e650" containerName="mariadb-account-delete" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144900 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c84021-6716-4d39-ab89-1cea45f77a64" containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144908 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c84021-6716-4d39-ab89-1cea45f77a64" containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144920 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243ddc02-c377-44ac-9b47-2240c3d9efed" containerName="galera" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144928 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="243ddc02-c377-44ac-9b47-2240c3d9efed" containerName="galera" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144937 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e0bab4-0913-4193-bb07-8d1802eda6c0" containerName="galera" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144944 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e0bab4-0913-4193-bb07-8d1802eda6c0" containerName="galera" Mar 07 04:45:00 crc kubenswrapper[4689]: E0307 04:45:00.144955 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5573c74-db15-40d3-9e5a-fa66061ec3bb" containerName="manager" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.144961 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5573c74-db15-40d3-9e5a-fa66061ec3bb" containerName="manager" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145076 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa03e1b-1007-4b01-9753-7c0ffa27b09c" 
containerName="operator" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145094 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd8b0d1d-32da-409d-9453-bef0c8ca65f1" containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145103 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41b2833-be4f-46a8-b1fb-7c244ac8530b" containerName="memcached" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145111 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8758a96-64ae-4c03-b392-5aa8c68cc641" containerName="rabbitmq" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145119 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="243ddc02-c377-44ac-9b47-2240c3d9efed" containerName="galera" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145128 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5573c74-db15-40d3-9e5a-fa66061ec3bb" containerName="manager" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145140 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e0bab4-0913-4193-bb07-8d1802eda6c0" containerName="galera" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145148 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fbf4774-52d5-49ff-8066-d6363f88c3c5" containerName="manager" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145157 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="71f5f795-049e-4dd3-b436-553b6f16e650" containerName="mariadb-account-delete" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145230 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb6e990-66c3-471d-b9b7-8a82f5652638" containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145243 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0bfd96e-646a-4e38-bcd8-c77623fea007" 
containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145251 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="096a01ec-b76b-4553-aa1b-91b0282c3470" containerName="manager" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145260 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="71f5f795-049e-4dd3-b436-553b6f16e650" containerName="mariadb-account-delete" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145270 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="05c84021-6716-4d39-ab89-1cea45f77a64" containerName="registry-server" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145279 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="af13923a-66fb-409e-a32e-42b1837151fe" containerName="galera" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145288 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e7640b-0391-468f-b8d7-8d0078e52e5f" containerName="keystone-api" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.145771 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.147771 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.148044 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.159976 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn"] Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.241144 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-config-volume\") pod \"collect-profiles-29547645-5mfhn\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.241196 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-secret-volume\") pod \"collect-profiles-29547645-5mfhn\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.241232 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5bw8\" (UniqueName: \"kubernetes.io/projected/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-kube-api-access-g5bw8\") pod \"collect-profiles-29547645-5mfhn\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.342845 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5bw8\" (UniqueName: \"kubernetes.io/projected/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-kube-api-access-g5bw8\") pod \"collect-profiles-29547645-5mfhn\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.342976 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-config-volume\") pod \"collect-profiles-29547645-5mfhn\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.343001 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-secret-volume\") pod \"collect-profiles-29547645-5mfhn\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.343988 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-config-volume\") pod \"collect-profiles-29547645-5mfhn\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.357808 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-secret-volume\") pod \"collect-profiles-29547645-5mfhn\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.362387 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5bw8\" (UniqueName: \"kubernetes.io/projected/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-kube-api-access-g5bw8\") pod \"collect-profiles-29547645-5mfhn\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.470848 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.907840 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4"] Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.908804 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" podUID="9edd6ad0-247d-45f0-95e9-0291d649c6ec" containerName="manager" containerID="cri-o://fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d" gracePeriod=10 Mar 07 04:45:00 crc kubenswrapper[4689]: I0307 04:45:00.970064 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn"] Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.203668 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-rnf5j"] Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.204195 4689 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/infra-operator-index-rnf5j" podUID="b07d06d3-554f-4c41-b001-e5d9338bbdf4" containerName="registry-server" containerID="cri-o://55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf" gracePeriod=30 Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.247841 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h"] Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.251666 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cksr7h"] Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.329694 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.461824 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d885l\" (UniqueName: \"kubernetes.io/projected/9edd6ad0-247d-45f0-95e9-0291d649c6ec-kube-api-access-d885l\") pod \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\" (UID: \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.461896 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-apiservice-cert\") pod \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\" (UID: \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.461964 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-webhook-cert\") pod \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\" (UID: \"9edd6ad0-247d-45f0-95e9-0291d649c6ec\") " Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 
04:45:01.467050 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edd6ad0-247d-45f0-95e9-0291d649c6ec-kube-api-access-d885l" (OuterVolumeSpecName: "kube-api-access-d885l") pod "9edd6ad0-247d-45f0-95e9-0291d649c6ec" (UID: "9edd6ad0-247d-45f0-95e9-0291d649c6ec"). InnerVolumeSpecName "kube-api-access-d885l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.467103 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "9edd6ad0-247d-45f0-95e9-0291d649c6ec" (UID: "9edd6ad0-247d-45f0-95e9-0291d649c6ec"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.470186 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "9edd6ad0-247d-45f0-95e9-0291d649c6ec" (UID: "9edd6ad0-247d-45f0-95e9-0291d649c6ec"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.563035 4689 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.563072 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d885l\" (UniqueName: \"kubernetes.io/projected/9edd6ad0-247d-45f0-95e9-0291d649c6ec-kube-api-access-d885l\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.563085 4689 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9edd6ad0-247d-45f0-95e9-0291d649c6ec-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.652998 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-rnf5j" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.765437 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrkkm\" (UniqueName: \"kubernetes.io/projected/b07d06d3-554f-4c41-b001-e5d9338bbdf4-kube-api-access-qrkkm\") pod \"b07d06d3-554f-4c41-b001-e5d9338bbdf4\" (UID: \"b07d06d3-554f-4c41-b001-e5d9338bbdf4\") " Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.780446 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b07d06d3-554f-4c41-b001-e5d9338bbdf4-kube-api-access-qrkkm" (OuterVolumeSpecName: "kube-api-access-qrkkm") pod "b07d06d3-554f-4c41-b001-e5d9338bbdf4" (UID: "b07d06d3-554f-4c41-b001-e5d9338bbdf4"). InnerVolumeSpecName "kube-api-access-qrkkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.836347 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62285b0a-fa87-4b64-b313-62f820cc9467" path="/var/lib/kubelet/pods/62285b0a-fa87-4b64-b313-62f820cc9467/volumes" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.867815 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrkkm\" (UniqueName: \"kubernetes.io/projected/b07d06d3-554f-4c41-b001-e5d9338bbdf4-kube-api-access-qrkkm\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.920967 4689 generic.go:334] "Generic (PLEG): container finished" podID="9edd6ad0-247d-45f0-95e9-0291d649c6ec" containerID="fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d" exitCode=0 Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.921055 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.921057 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" event={"ID":"9edd6ad0-247d-45f0-95e9-0291d649c6ec","Type":"ContainerDied","Data":"fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d"} Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.921209 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4" event={"ID":"9edd6ad0-247d-45f0-95e9-0291d649c6ec","Type":"ContainerDied","Data":"d1bfc89c10713d5e6c4965888b82c56529c05dd4bb152d98f8d99bef100c9bc4"} Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.921238 4689 scope.go:117] "RemoveContainer" containerID="fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.923219 4689 
generic.go:334] "Generic (PLEG): container finished" podID="b07d06d3-554f-4c41-b001-e5d9338bbdf4" containerID="55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf" exitCode=0 Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.923282 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-rnf5j" event={"ID":"b07d06d3-554f-4c41-b001-e5d9338bbdf4","Type":"ContainerDied","Data":"55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf"} Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.923303 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-rnf5j" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.923323 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-rnf5j" event={"ID":"b07d06d3-554f-4c41-b001-e5d9338bbdf4","Type":"ContainerDied","Data":"447a44482c07136c3563632351627ce42e19ecddc6fcd3fade3598bc082c8ec0"} Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.926375 4689 generic.go:334] "Generic (PLEG): container finished" podID="829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e" containerID="ec2849658af7d85f6e4810e8bb5a5cf2deb0a8c1acbf0435a58b7e5877b4771f" exitCode=0 Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.926400 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" event={"ID":"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e","Type":"ContainerDied","Data":"ec2849658af7d85f6e4810e8bb5a5cf2deb0a8c1acbf0435a58b7e5877b4771f"} Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.926416 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" event={"ID":"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e","Type":"ContainerStarted","Data":"079dd4c0a6f3bd2d17790d54432173248a3770b798520156ea25d7bad8e69043"} Mar 07 04:45:01 crc 
kubenswrapper[4689]: I0307 04:45:01.942929 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4"] Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.943120 4689 scope.go:117] "RemoveContainer" containerID="fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d" Mar 07 04:45:01 crc kubenswrapper[4689]: E0307 04:45:01.944648 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d\": container with ID starting with fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d not found: ID does not exist" containerID="fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.944766 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d"} err="failed to get container status \"fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d\": rpc error: code = NotFound desc = could not find container \"fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d\": container with ID starting with fa945051588f28eb976befe0a9a537d1e7d95674810b9914a6c40734b620bb4d not found: ID does not exist" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.944825 4689 scope.go:117] "RemoveContainer" containerID="55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.947985 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9b75f4d4d-869m4"] Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.971051 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-rnf5j"] Mar 07 04:45:01 crc kubenswrapper[4689]: 
I0307 04:45:01.975063 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-rnf5j"] Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.981815 4689 scope.go:117] "RemoveContainer" containerID="55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf" Mar 07 04:45:01 crc kubenswrapper[4689]: E0307 04:45:01.982476 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf\": container with ID starting with 55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf not found: ID does not exist" containerID="55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf" Mar 07 04:45:01 crc kubenswrapper[4689]: I0307 04:45:01.982510 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf"} err="failed to get container status \"55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf\": rpc error: code = NotFound desc = could not find container \"55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf\": container with ID starting with 55c54103a24296ac84d576e418a55f72d6cb1d4dbdd9a58c6cd6a44a190916bf not found: ID does not exist" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.047546 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr"] Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.048078 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" podUID="44ccc3e9-523e-49f4-a647-87bad23b837f" containerName="manager" containerID="cri-o://931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee" gracePeriod=10 Mar 07 04:45:03 crc kubenswrapper[4689]: 
I0307 04:45:03.146974 4689 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" podUID="44ccc3e9-523e-49f4-a647-87bad23b837f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8081/readyz\": dial tcp 10.217.0.47:8081: connect: connection refused" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.281877 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-mzlbx"] Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.282118 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-mzlbx" podUID="ee3b71b3-32e5-46a2-9f3e-589e7da005a4" containerName="registry-server" containerID="cri-o://ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695" gracePeriod=30 Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.310853 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk"] Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.315674 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62xllnk"] Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.466591 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.518550 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.599021 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-webhook-cert\") pod \"44ccc3e9-523e-49f4-a647-87bad23b837f\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.599060 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr5mq\" (UniqueName: \"kubernetes.io/projected/44ccc3e9-523e-49f4-a647-87bad23b837f-kube-api-access-wr5mq\") pod \"44ccc3e9-523e-49f4-a647-87bad23b837f\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.599121 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-config-volume\") pod \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.599405 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-secret-volume\") pod \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.599438 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5bw8\" (UniqueName: \"kubernetes.io/projected/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-kube-api-access-g5bw8\") pod \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\" (UID: \"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e\") " Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.599480 4689 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-apiservice-cert\") pod \"44ccc3e9-523e-49f4-a647-87bad23b837f\" (UID: \"44ccc3e9-523e-49f4-a647-87bad23b837f\") " Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.600023 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-config-volume" (OuterVolumeSpecName: "config-volume") pod "829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e" (UID: "829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.603780 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "44ccc3e9-523e-49f4-a647-87bad23b837f" (UID: "44ccc3e9-523e-49f4-a647-87bad23b837f"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.605849 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "44ccc3e9-523e-49f4-a647-87bad23b837f" (UID: "44ccc3e9-523e-49f4-a647-87bad23b837f"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.608252 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e" (UID: "829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.609396 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-kube-api-access-g5bw8" (OuterVolumeSpecName: "kube-api-access-g5bw8") pod "829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e" (UID: "829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e"). InnerVolumeSpecName "kube-api-access-g5bw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.609510 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ccc3e9-523e-49f4-a647-87bad23b837f-kube-api-access-wr5mq" (OuterVolumeSpecName: "kube-api-access-wr5mq") pod "44ccc3e9-523e-49f4-a647-87bad23b837f" (UID: "44ccc3e9-523e-49f4-a647-87bad23b837f"). InnerVolumeSpecName "kube-api-access-wr5mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.700943 4689 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.700988 4689 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44ccc3e9-523e-49f4-a647-87bad23b837f-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.701003 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr5mq\" (UniqueName: \"kubernetes.io/projected/44ccc3e9-523e-49f4-a647-87bad23b837f-kube-api-access-wr5mq\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.701016 4689 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.701028 4689 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.701038 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5bw8\" (UniqueName: \"kubernetes.io/projected/829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e-kube-api-access-g5bw8\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.726315 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-mzlbx" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.801527 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss6k7\" (UniqueName: \"kubernetes.io/projected/ee3b71b3-32e5-46a2-9f3e-589e7da005a4-kube-api-access-ss6k7\") pod \"ee3b71b3-32e5-46a2-9f3e-589e7da005a4\" (UID: \"ee3b71b3-32e5-46a2-9f3e-589e7da005a4\") " Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.807278 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3b71b3-32e5-46a2-9f3e-589e7da005a4-kube-api-access-ss6k7" (OuterVolumeSpecName: "kube-api-access-ss6k7") pod "ee3b71b3-32e5-46a2-9f3e-589e7da005a4" (UID: "ee3b71b3-32e5-46a2-9f3e-589e7da005a4"). InnerVolumeSpecName "kube-api-access-ss6k7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.832892 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f93314-9b2f-4bac-90ac-20c44ed8b998" path="/var/lib/kubelet/pods/99f93314-9b2f-4bac-90ac-20c44ed8b998/volumes" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.833493 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9edd6ad0-247d-45f0-95e9-0291d649c6ec" path="/var/lib/kubelet/pods/9edd6ad0-247d-45f0-95e9-0291d649c6ec/volumes" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.834051 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b07d06d3-554f-4c41-b001-e5d9338bbdf4" path="/var/lib/kubelet/pods/b07d06d3-554f-4c41-b001-e5d9338bbdf4/volumes" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.903265 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss6k7\" (UniqueName: \"kubernetes.io/projected/ee3b71b3-32e5-46a2-9f3e-589e7da005a4-kube-api-access-ss6k7\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.946723 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" event={"ID":"829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e","Type":"ContainerDied","Data":"079dd4c0a6f3bd2d17790d54432173248a3770b798520156ea25d7bad8e69043"} Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.946752 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547645-5mfhn" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.946757 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="079dd4c0a6f3bd2d17790d54432173248a3770b798520156ea25d7bad8e69043" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.948436 4689 generic.go:334] "Generic (PLEG): container finished" podID="44ccc3e9-523e-49f4-a647-87bad23b837f" containerID="931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee" exitCode=0 Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.948494 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.948517 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" event={"ID":"44ccc3e9-523e-49f4-a647-87bad23b837f","Type":"ContainerDied","Data":"931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee"} Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.948547 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr" event={"ID":"44ccc3e9-523e-49f4-a647-87bad23b837f","Type":"ContainerDied","Data":"af4641fb206d395bc745a9599f9368d79e2a02f08c9f00e13f3c94db2e2a1bd0"} Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.948564 4689 scope.go:117] "RemoveContainer" containerID="931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.950441 4689 generic.go:334] "Generic (PLEG): container finished" podID="ee3b71b3-32e5-46a2-9f3e-589e7da005a4" containerID="ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695" exitCode=0 Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.950463 4689 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-mzlbx" event={"ID":"ee3b71b3-32e5-46a2-9f3e-589e7da005a4","Type":"ContainerDied","Data":"ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695"} Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.950475 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-mzlbx" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.950489 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-mzlbx" event={"ID":"ee3b71b3-32e5-46a2-9f3e-589e7da005a4","Type":"ContainerDied","Data":"4c260547acbd1a6f12d9a180124b1fbb76177264802150fd79fb5ea023b010dc"} Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.966934 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr"] Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.968054 4689 scope.go:117] "RemoveContainer" containerID="931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee" Mar 07 04:45:03 crc kubenswrapper[4689]: E0307 04:45:03.968419 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee\": container with ID starting with 931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee not found: ID does not exist" containerID="931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.968445 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee"} err="failed to get container status \"931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee\": rpc error: code = NotFound desc = could not 
find container \"931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee\": container with ID starting with 931948494b4d466c3f8be4b1733b026421e3c019e52e54f5800146c57abf0aee not found: ID does not exist" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.968470 4689 scope.go:117] "RemoveContainer" containerID="ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.974392 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64bcb8dbcf-9hmhr"] Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.985691 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-mzlbx"] Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.986890 4689 scope.go:117] "RemoveContainer" containerID="ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695" Mar 07 04:45:03 crc kubenswrapper[4689]: E0307 04:45:03.987277 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695\": container with ID starting with ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695 not found: ID does not exist" containerID="ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695" Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.987307 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695"} err="failed to get container status \"ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695\": rpc error: code = NotFound desc = could not find container \"ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695\": container with ID starting with ce56a2496c6d07084a01ed213b4df3bcbd3f6eb0325f3de9048fe14b9bc4b695 not found: ID does not exist" 
Mar 07 04:45:03 crc kubenswrapper[4689]: I0307 04:45:03.989828 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-mzlbx"] Mar 07 04:45:05 crc kubenswrapper[4689]: I0307 04:45:05.842349 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ccc3e9-523e-49f4-a647-87bad23b837f" path="/var/lib/kubelet/pods/44ccc3e9-523e-49f4-a647-87bad23b837f/volumes" Mar 07 04:45:05 crc kubenswrapper[4689]: I0307 04:45:05.843790 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee3b71b3-32e5-46a2-9f3e-589e7da005a4" path="/var/lib/kubelet/pods/ee3b71b3-32e5-46a2-9f3e-589e7da005a4/volumes" Mar 07 04:45:06 crc kubenswrapper[4689]: E0307 04:45:06.948068 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:45:06 crc kubenswrapper[4689]: E0307 04:45:06.948183 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:45:07.448148374 +0000 UTC m=+1552.494531863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:45:06 crc kubenswrapper[4689]: E0307 04:45:06.948161 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:45:06 crc kubenswrapper[4689]: E0307 04:45:06.948279 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. 
No retries permitted until 2026-03-07 04:45:07.448257967 +0000 UTC m=+1552.494641466 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:45:07 crc kubenswrapper[4689]: E0307 04:45:07.455064 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:45:07 crc kubenswrapper[4689]: E0307 04:45:07.455085 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:45:07 crc kubenswrapper[4689]: E0307 04:45:07.455205 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:45:08.455151999 +0000 UTC m=+1553.501535528 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:45:07 crc kubenswrapper[4689]: E0307 04:45:07.455252 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:45:08.45522217 +0000 UTC m=+1553.501605689 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:45:07 crc kubenswrapper[4689]: I0307 04:45:07.995953 4689 generic.go:334] "Generic (PLEG): container finished" podID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerID="af3a3e3771dc5dcb25112f7477a92fb7553646ad838f63f1cf844231472aa223" exitCode=137 Mar 07 04:45:07 crc kubenswrapper[4689]: I0307 04:45:07.996141 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"af3a3e3771dc5dcb25112f7477a92fb7553646ad838f63f1cf844231472aa223"} Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.062614 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.168502 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-cache\") pod \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.168558 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjfbz\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-kube-api-access-mjfbz\") pod \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.168579 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\" (UID: 
\"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.168668 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-lock\") pod \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.168700 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift\") pod \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\" (UID: \"72bf7dd5-1e66-47a7-ae3f-477fcfb02742\") " Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.169535 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-lock" (OuterVolumeSpecName: "lock") pod "72bf7dd5-1e66-47a7-ae3f-477fcfb02742" (UID: "72bf7dd5-1e66-47a7-ae3f-477fcfb02742"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.169569 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-cache" (OuterVolumeSpecName: "cache") pod "72bf7dd5-1e66-47a7-ae3f-477fcfb02742" (UID: "72bf7dd5-1e66-47a7-ae3f-477fcfb02742"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.173711 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-kube-api-access-mjfbz" (OuterVolumeSpecName: "kube-api-access-mjfbz") pod "72bf7dd5-1e66-47a7-ae3f-477fcfb02742" (UID: "72bf7dd5-1e66-47a7-ae3f-477fcfb02742"). InnerVolumeSpecName "kube-api-access-mjfbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.174220 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "swift") pod "72bf7dd5-1e66-47a7-ae3f-477fcfb02742" (UID: "72bf7dd5-1e66-47a7-ae3f-477fcfb02742"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.178297 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "72bf7dd5-1e66-47a7-ae3f-477fcfb02742" (UID: "72bf7dd5-1e66-47a7-ae3f-477fcfb02742"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.269886 4689 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-lock\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.269913 4689 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.269922 4689 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-cache\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.269931 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjfbz\" (UniqueName: \"kubernetes.io/projected/72bf7dd5-1e66-47a7-ae3f-477fcfb02742-kube-api-access-mjfbz\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.269951 
4689 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.284664 4689 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Mar 07 04:45:08 crc kubenswrapper[4689]: I0307 04:45:08.371329 4689 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Mar 07 04:45:08 crc kubenswrapper[4689]: E0307 04:45:08.473519 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:45:08 crc kubenswrapper[4689]: E0307 04:45:08.473669 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:45:10.473635938 +0000 UTC m=+1555.520019477 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:45:08 crc kubenswrapper[4689]: E0307 04:45:08.473542 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:45:08 crc kubenswrapper[4689]: E0307 04:45:08.473787 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. 
No retries permitted until 2026-03-07 04:45:10.473757152 +0000 UTC m=+1555.520140681 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.024383 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"72bf7dd5-1e66-47a7-ae3f-477fcfb02742","Type":"ContainerDied","Data":"8a0e80d31c1c06f9c555faba84f7fad9db38e5c920bc01a3b1c6d095ca12ab39"} Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.024447 4689 scope.go:117] "RemoveContainer" containerID="af3a3e3771dc5dcb25112f7477a92fb7553646ad838f63f1cf844231472aa223" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.024589 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.056842 4689 scope.go:117] "RemoveContainer" containerID="a460a955db79fbf911a926367b117b7f6ceb0c5df6dbccaeddba0833bd8d1785" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.075379 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.081375 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.086524 4689 scope.go:117] "RemoveContainer" containerID="1f89449ad1a80fea4286fef990935e960507f7bf84f0d843d58e0d743c6402d3" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.114812 4689 scope.go:117] "RemoveContainer" containerID="d0fa234be29bc574f8e8ac0e9059a71a0665d50a4a5e8587b656627fe358168d" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.139674 4689 
scope.go:117] "RemoveContainer" containerID="fcc4fd1908f707c3a9b6e85c0ed1a296725aeb2ace62d4136b7cfdf7e4793cb9" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.169072 4689 scope.go:117] "RemoveContainer" containerID="838f0a0e47edc581a3586403409c00fe391e1a0446670e6c3dae72a34453a3a6" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.196647 4689 scope.go:117] "RemoveContainer" containerID="c9115983fb96eb604ca6eee60e5a2764c938cbe715a4b97fa1cffc9f4cfcf61f" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.215258 4689 scope.go:117] "RemoveContainer" containerID="df2b64bed9e2330912063f36cf4cceb10965d467dee93369db2730c1e257474e" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.233437 4689 scope.go:117] "RemoveContainer" containerID="faad358fd307a99964689f91a5acb7e967ffb6178743b7f718e092bf976a7e8d" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.249234 4689 scope.go:117] "RemoveContainer" containerID="1b3960e36d0b90b01c78ef9cdfc8857c059e03d4fc0b35cebbdbda9d25c2e743" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.265520 4689 scope.go:117] "RemoveContainer" containerID="1ce86596f91d66453e465f97afa8624aca1b2b8b2d59d3a5f990349cc84881ae" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.287271 4689 scope.go:117] "RemoveContainer" containerID="70864ec57f40a96c0cbd682f99e5cc28caf7680eef12aaeebbb5fef77b84ca71" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.310342 4689 scope.go:117] "RemoveContainer" containerID="95c6d4b787a84767360101ff6c8db1dcbf368d75db58bcc4657444d42e1121e2" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.338071 4689 scope.go:117] "RemoveContainer" containerID="4cf74cb6827c9d9ba68e8c8dfa337659418d093ae76fce1056d4f84b43758ab5" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.369165 4689 scope.go:117] "RemoveContainer" containerID="413ff247560a52a36969f0cf2f05c5b652d77df05c0d4413b58fcf079e14f38c" Mar 07 04:45:09 crc kubenswrapper[4689]: I0307 04:45:09.836963 4689 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" path="/var/lib/kubelet/pods/72bf7dd5-1e66-47a7-ae3f-477fcfb02742/volumes" Mar 07 04:45:10 crc kubenswrapper[4689]: E0307 04:45:10.501374 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:45:10 crc kubenswrapper[4689]: E0307 04:45:10.501467 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:45:14.501444989 +0000 UTC m=+1559.547828518 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:45:10 crc kubenswrapper[4689]: E0307 04:45:10.501675 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:45:10 crc kubenswrapper[4689]: E0307 04:45:10.501786 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:45:14.501757848 +0000 UTC m=+1559.548141367 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:45:14 crc kubenswrapper[4689]: E0307 04:45:14.562740 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:45:14 crc kubenswrapper[4689]: E0307 04:45:14.563017 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:45:22.563002623 +0000 UTC m=+1567.609386112 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:45:14 crc kubenswrapper[4689]: E0307 04:45:14.562756 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:45:14 crc kubenswrapper[4689]: E0307 04:45:14.563350 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:45:22.563342172 +0000 UTC m=+1567.609725661 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.492920 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hdr7x/must-gather-kbnks"] Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493449 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-replicator" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493464 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-replicator" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493481 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ccc3e9-523e-49f4-a647-87bad23b837f" containerName="manager" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493489 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ccc3e9-523e-49f4-a647-87bad23b837f" containerName="manager" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493501 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="swift-recon-cron" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493510 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="swift-recon-cron" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493520 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-updater" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493527 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" 
containerName="object-updater" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493537 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="rsync" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493544 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="rsync" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493555 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edd6ad0-247d-45f0-95e9-0291d649c6ec" containerName="manager" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493564 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edd6ad0-247d-45f0-95e9-0291d649c6ec" containerName="manager" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493576 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-replicator" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493585 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-replicator" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493598 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-auditor" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493606 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-auditor" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493614 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-updater" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493622 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-updater" Mar 07 
04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493634 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-replicator" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493644 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-replicator" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493658 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-server" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493666 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-server" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493676 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-expirer" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493684 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-expirer" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493693 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-auditor" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493700 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-auditor" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493711 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-auditor" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493719 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-auditor" Mar 07 04:45:17 
crc kubenswrapper[4689]: E0307 04:45:17.493732 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e" containerName="collect-profiles" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493740 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e" containerName="collect-profiles" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493751 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-reaper" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493759 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-reaper" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493772 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3b71b3-32e5-46a2-9f3e-589e7da005a4" containerName="registry-server" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493780 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3b71b3-32e5-46a2-9f3e-589e7da005a4" containerName="registry-server" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493790 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b07d06d3-554f-4c41-b001-e5d9338bbdf4" containerName="registry-server" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493798 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="b07d06d3-554f-4c41-b001-e5d9338bbdf4" containerName="registry-server" Mar 07 04:45:17 crc kubenswrapper[4689]: E0307 04:45:17.493811 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-server" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493820 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-server" Mar 07 04:45:17 crc 
kubenswrapper[4689]: E0307 04:45:17.493834 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-server" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493842 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-server" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493958 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ccc3e9-523e-49f4-a647-87bad23b837f" containerName="manager" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493970 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-replicator" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493984 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-server" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.493993 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3b71b3-32e5-46a2-9f3e-589e7da005a4" containerName="registry-server" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494004 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-auditor" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494015 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="9edd6ad0-247d-45f0-95e9-0291d649c6ec" containerName="manager" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494024 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="rsync" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494033 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-replicator" Mar 07 04:45:17 crc 
kubenswrapper[4689]: I0307 04:45:17.494043 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-server" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494051 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-updater" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494062 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-server" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494072 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-replicator" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494082 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="829af43e-2eb4-4bf1-a5bc-e5b6b1c96a6e" containerName="collect-profiles" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494094 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="object-updater" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494106 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="swift-recon-cron" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494118 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="b07d06d3-554f-4c41-b001-e5d9338bbdf4" containerName="registry-server" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494125 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-auditor" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494134 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" 
containerName="object-expirer" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494144 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="container-auditor" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.494232 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bf7dd5-1e66-47a7-ae3f-477fcfb02742" containerName="account-reaper" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.495093 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hdr7x/must-gather-kbnks" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.499659 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hdr7x"/"kube-root-ca.crt" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.499997 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hdr7x"/"openshift-service-ca.crt" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.515908 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdr7x/must-gather-kbnks"] Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.606424 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l99hm\" (UniqueName: \"kubernetes.io/projected/667e4097-0a9c-40d6-a15a-c7a0066085ac-kube-api-access-l99hm\") pod \"must-gather-kbnks\" (UID: \"667e4097-0a9c-40d6-a15a-c7a0066085ac\") " pod="openshift-must-gather-hdr7x/must-gather-kbnks" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.606598 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/667e4097-0a9c-40d6-a15a-c7a0066085ac-must-gather-output\") pod \"must-gather-kbnks\" (UID: \"667e4097-0a9c-40d6-a15a-c7a0066085ac\") " 
pod="openshift-must-gather-hdr7x/must-gather-kbnks" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.708081 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/667e4097-0a9c-40d6-a15a-c7a0066085ac-must-gather-output\") pod \"must-gather-kbnks\" (UID: \"667e4097-0a9c-40d6-a15a-c7a0066085ac\") " pod="openshift-must-gather-hdr7x/must-gather-kbnks" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.708260 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l99hm\" (UniqueName: \"kubernetes.io/projected/667e4097-0a9c-40d6-a15a-c7a0066085ac-kube-api-access-l99hm\") pod \"must-gather-kbnks\" (UID: \"667e4097-0a9c-40d6-a15a-c7a0066085ac\") " pod="openshift-must-gather-hdr7x/must-gather-kbnks" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.708593 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/667e4097-0a9c-40d6-a15a-c7a0066085ac-must-gather-output\") pod \"must-gather-kbnks\" (UID: \"667e4097-0a9c-40d6-a15a-c7a0066085ac\") " pod="openshift-must-gather-hdr7x/must-gather-kbnks" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.739894 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l99hm\" (UniqueName: \"kubernetes.io/projected/667e4097-0a9c-40d6-a15a-c7a0066085ac-kube-api-access-l99hm\") pod \"must-gather-kbnks\" (UID: \"667e4097-0a9c-40d6-a15a-c7a0066085ac\") " pod="openshift-must-gather-hdr7x/must-gather-kbnks" Mar 07 04:45:17 crc kubenswrapper[4689]: I0307 04:45:17.811941 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hdr7x/must-gather-kbnks" Mar 07 04:45:18 crc kubenswrapper[4689]: I0307 04:45:18.200442 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdr7x/must-gather-kbnks"] Mar 07 04:45:19 crc kubenswrapper[4689]: I0307 04:45:19.115960 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdr7x/must-gather-kbnks" event={"ID":"667e4097-0a9c-40d6-a15a-c7a0066085ac","Type":"ContainerStarted","Data":"6a26f60e005644d86313e567215292a1771b577a6e7a0e9dbcced6fd79e51d3f"} Mar 07 04:45:22 crc kubenswrapper[4689]: E0307 04:45:22.587652 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:45:22 crc kubenswrapper[4689]: E0307 04:45:22.587694 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:45:22 crc kubenswrapper[4689]: E0307 04:45:22.587980 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:45:38.58796219 +0000 UTC m=+1583.634345679 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:45:22 crc kubenswrapper[4689]: E0307 04:45:22.588131 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:45:38.588088133 +0000 UTC m=+1583.634471712 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:45:24 crc kubenswrapper[4689]: I0307 04:45:24.158637 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdr7x/must-gather-kbnks" event={"ID":"667e4097-0a9c-40d6-a15a-c7a0066085ac","Type":"ContainerStarted","Data":"606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2"} Mar 07 04:45:25 crc kubenswrapper[4689]: I0307 04:45:25.166675 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdr7x/must-gather-kbnks" event={"ID":"667e4097-0a9c-40d6-a15a-c7a0066085ac","Type":"ContainerStarted","Data":"a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470"} Mar 07 04:45:25 crc kubenswrapper[4689]: I0307 04:45:25.181922 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hdr7x/must-gather-kbnks" podStartSLOduration=2.567433485 podStartE2EDuration="8.181907793s" podCreationTimestamp="2026-03-07 04:45:17 +0000 UTC" firstStartedPulling="2026-03-07 04:45:18.20938268 +0000 UTC m=+1563.255766169" lastFinishedPulling="2026-03-07 04:45:23.823856978 +0000 UTC m=+1568.870240477" observedRunningTime="2026-03-07 04:45:25.178345126 +0000 UTC m=+1570.224728605" watchObservedRunningTime="2026-03-07 04:45:25.181907793 +0000 UTC m=+1570.228291282" Mar 07 04:45:29 crc kubenswrapper[4689]: I0307 04:45:29.189564 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:45:29 crc kubenswrapper[4689]: I0307 04:45:29.189618 4689 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:45:29 crc kubenswrapper[4689]: I0307 04:45:29.189660 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:45:29 crc kubenswrapper[4689]: I0307 04:45:29.190163 4689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536"} pod="openshift-machine-config-operator/machine-config-daemon-dss5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 04:45:29 crc kubenswrapper[4689]: I0307 04:45:29.190233 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" containerID="cri-o://84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" gracePeriod=600 Mar 07 04:45:30 crc kubenswrapper[4689]: E0307 04:45:30.107483 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:45:30 crc kubenswrapper[4689]: I0307 04:45:30.202162 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" 
containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" exitCode=0 Mar 07 04:45:30 crc kubenswrapper[4689]: I0307 04:45:30.202238 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerDied","Data":"84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536"} Mar 07 04:45:30 crc kubenswrapper[4689]: I0307 04:45:30.202341 4689 scope.go:117] "RemoveContainer" containerID="1d7f7f5d4bedb9f0999f9f7b5b22121b12b61459642fd73d8cbc908ec8691b15" Mar 07 04:45:30 crc kubenswrapper[4689]: I0307 04:45:30.202979 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:45:30 crc kubenswrapper[4689]: E0307 04:45:30.203553 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:45:38 crc kubenswrapper[4689]: E0307 04:45:38.602711 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:45:38 crc kubenswrapper[4689]: E0307 04:45:38.603286 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:46:10.603270501 +0000 UTC m=+1615.649653990 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:45:38 crc kubenswrapper[4689]: E0307 04:45:38.602817 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:45:38 crc kubenswrapper[4689]: E0307 04:45:38.603404 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:46:10.603372064 +0000 UTC m=+1615.649755593 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:45:44 crc kubenswrapper[4689]: I0307 04:45:44.825970 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:45:44 crc kubenswrapper[4689]: E0307 04:45:44.826738 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:45:46 crc kubenswrapper[4689]: I0307 04:45:46.744633 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xhc66"] Mar 07 04:45:46 crc 
kubenswrapper[4689]: I0307 04:45:46.746020 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:46 crc kubenswrapper[4689]: I0307 04:45:46.760083 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xhc66"] Mar 07 04:45:46 crc kubenswrapper[4689]: I0307 04:45:46.822617 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-catalog-content\") pod \"certified-operators-xhc66\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:46 crc kubenswrapper[4689]: I0307 04:45:46.822854 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2hng\" (UniqueName: \"kubernetes.io/projected/def1cdb6-7ee9-4587-97d1-01066d6bdb16-kube-api-access-v2hng\") pod \"certified-operators-xhc66\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:46 crc kubenswrapper[4689]: I0307 04:45:46.823071 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-utilities\") pod \"certified-operators-xhc66\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:46 crc kubenswrapper[4689]: I0307 04:45:46.924957 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2hng\" (UniqueName: \"kubernetes.io/projected/def1cdb6-7ee9-4587-97d1-01066d6bdb16-kube-api-access-v2hng\") pod \"certified-operators-xhc66\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " 
pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:46 crc kubenswrapper[4689]: I0307 04:45:46.925066 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-utilities\") pod \"certified-operators-xhc66\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:46 crc kubenswrapper[4689]: I0307 04:45:46.925102 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-catalog-content\") pod \"certified-operators-xhc66\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:46 crc kubenswrapper[4689]: I0307 04:45:46.925612 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-utilities\") pod \"certified-operators-xhc66\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:46 crc kubenswrapper[4689]: I0307 04:45:46.925963 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-catalog-content\") pod \"certified-operators-xhc66\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:46 crc kubenswrapper[4689]: I0307 04:45:46.944293 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2hng\" (UniqueName: \"kubernetes.io/projected/def1cdb6-7ee9-4587-97d1-01066d6bdb16-kube-api-access-v2hng\") pod \"certified-operators-xhc66\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " 
pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:47 crc kubenswrapper[4689]: I0307 04:45:47.072485 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:47 crc kubenswrapper[4689]: I0307 04:45:47.500954 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xhc66"] Mar 07 04:45:48 crc kubenswrapper[4689]: I0307 04:45:48.340083 4689 generic.go:334] "Generic (PLEG): container finished" podID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" containerID="647742d10a3532198d2e75c8f34248bcb420e699e166f8807b4b270acb77b703" exitCode=0 Mar 07 04:45:48 crc kubenswrapper[4689]: I0307 04:45:48.340289 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhc66" event={"ID":"def1cdb6-7ee9-4587-97d1-01066d6bdb16","Type":"ContainerDied","Data":"647742d10a3532198d2e75c8f34248bcb420e699e166f8807b4b270acb77b703"} Mar 07 04:45:48 crc kubenswrapper[4689]: I0307 04:45:48.340378 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhc66" event={"ID":"def1cdb6-7ee9-4587-97d1-01066d6bdb16","Type":"ContainerStarted","Data":"c5db016fef2298fb766a5d2ce51c83c2c7edabceac09a1f9788de9c736f463bc"} Mar 07 04:45:49 crc kubenswrapper[4689]: I0307 04:45:49.348721 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhc66" event={"ID":"def1cdb6-7ee9-4587-97d1-01066d6bdb16","Type":"ContainerStarted","Data":"b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8"} Mar 07 04:45:50 crc kubenswrapper[4689]: I0307 04:45:50.360039 4689 generic.go:334] "Generic (PLEG): container finished" podID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" containerID="b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8" exitCode=0 Mar 07 04:45:50 crc kubenswrapper[4689]: I0307 04:45:50.360117 4689 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhc66" event={"ID":"def1cdb6-7ee9-4587-97d1-01066d6bdb16","Type":"ContainerDied","Data":"b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8"} Mar 07 04:45:51 crc kubenswrapper[4689]: I0307 04:45:51.376756 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhc66" event={"ID":"def1cdb6-7ee9-4587-97d1-01066d6bdb16","Type":"ContainerStarted","Data":"cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac"} Mar 07 04:45:51 crc kubenswrapper[4689]: I0307 04:45:51.409263 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xhc66" podStartSLOduration=2.967882247 podStartE2EDuration="5.40923504s" podCreationTimestamp="2026-03-07 04:45:46 +0000 UTC" firstStartedPulling="2026-03-07 04:45:48.342498727 +0000 UTC m=+1593.388882216" lastFinishedPulling="2026-03-07 04:45:50.7838515 +0000 UTC m=+1595.830235009" observedRunningTime="2026-03-07 04:45:51.402855687 +0000 UTC m=+1596.449239256" watchObservedRunningTime="2026-03-07 04:45:51.40923504 +0000 UTC m=+1596.455618569" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.593159 4689 scope.go:117] "RemoveContainer" containerID="eb30690f7cfbde23b181a73ce131bea2aa55e2dfb9c338b2b159325a96c8afe7" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.613807 4689 scope.go:117] "RemoveContainer" containerID="37fc54a838abbd84f29cf51bdff86617b52601ce04d177aa8f6a9684fa59f10e" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.650784 4689 scope.go:117] "RemoveContainer" containerID="a9a14cb1784c5d48776ae86aaeb6eb42392ff6406deca6fe1b8f76e6a1ff7cc7" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.674682 4689 scope.go:117] "RemoveContainer" containerID="37a29d50dfdb5292360a814d6cc16d4e330cc2410daace4af257df61f7af8260" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.699824 4689 scope.go:117] "RemoveContainer" 
containerID="12513990d7ac6e8d6b3f612b66e2585a45622b21da1c3761348fca6184dca979" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.733371 4689 scope.go:117] "RemoveContainer" containerID="05b2be4ed567110edd549eb6ed857425bac74c0325e23a8df744b6cc8cb33ca8" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.753237 4689 scope.go:117] "RemoveContainer" containerID="5d6a86917778a56ef7e6a6e34b7267230910937ec2cffcd68f2763f316f886d9" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.771200 4689 scope.go:117] "RemoveContainer" containerID="b02927d1de884d5663972f872f84754a8034f7702a4b935529b0f8e12587ce6e" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.787358 4689 scope.go:117] "RemoveContainer" containerID="20e88a6e6b94587cc0815d0862d76cc505bad41e0e860cf5e1724f7f08245876" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.802287 4689 scope.go:117] "RemoveContainer" containerID="e3269f3d9967a4c91ba27c543fa32589a9378c0078bea79b1a2bf3a9704a94bc" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.820240 4689 scope.go:117] "RemoveContainer" containerID="2af979f74a740f172876998eb5fab2143962635ed30c4581a1c075c83143a1e6" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.861399 4689 scope.go:117] "RemoveContainer" containerID="59b9470ac28854b2cd352e30c2faeaae9c7143dd5cd89f202217c105ff4dc226" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.885602 4689 scope.go:117] "RemoveContainer" containerID="72d3c13287afb9eebc18fb3588b7edfdd594507fe230f25ccb60e256a61abcd1" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.910409 4689 scope.go:117] "RemoveContainer" containerID="4240dafe00147626cda55455f2cd59a98d2e15661dba5b0ed2b113b507baf83e" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.929391 4689 scope.go:117] "RemoveContainer" containerID="ca950ee2382d076ca1acd44912cd90dac80f7886c7531c374f37d57c8d68baa9" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.944288 4689 scope.go:117] "RemoveContainer" 
containerID="437d29f2802f7fdeac8810f3b54b21fdbf943ed5cf5e20263f57909ee89728a1" Mar 07 04:45:52 crc kubenswrapper[4689]: I0307 04:45:52.960407 4689 scope.go:117] "RemoveContainer" containerID="9cafcc6cec675f507108af2538822d2ec2e7bc27678e7e2758e3df9f8b5b19df" Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.128320 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6l64j"] Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.133911 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.141086 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6l64j"] Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.238302 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-catalog-content\") pod \"redhat-marketplace-6l64j\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.238444 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rjns\" (UniqueName: \"kubernetes.io/projected/d737c925-6772-46fe-b0d5-3779bff2aea0-kube-api-access-9rjns\") pod \"redhat-marketplace-6l64j\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.238479 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-utilities\") pod \"redhat-marketplace-6l64j\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " 
pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.339843 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-catalog-content\") pod \"redhat-marketplace-6l64j\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.340073 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rjns\" (UniqueName: \"kubernetes.io/projected/d737c925-6772-46fe-b0d5-3779bff2aea0-kube-api-access-9rjns\") pod \"redhat-marketplace-6l64j\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.340153 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-utilities\") pod \"redhat-marketplace-6l64j\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.340864 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-catalog-content\") pod \"redhat-marketplace-6l64j\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.340873 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-utilities\") pod \"redhat-marketplace-6l64j\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " pod="openshift-marketplace/redhat-marketplace-6l64j" 
Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.370264 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rjns\" (UniqueName: \"kubernetes.io/projected/d737c925-6772-46fe-b0d5-3779bff2aea0-kube-api-access-9rjns\") pod \"redhat-marketplace-6l64j\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.488138 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:45:54 crc kubenswrapper[4689]: I0307 04:45:54.925088 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6l64j"] Mar 07 04:45:55 crc kubenswrapper[4689]: I0307 04:45:55.406968 4689 generic.go:334] "Generic (PLEG): container finished" podID="d737c925-6772-46fe-b0d5-3779bff2aea0" containerID="a4fb32bcead719c2d73412506b5eef63823aa3d0d8ba63b5521d5839c3f27dfd" exitCode=0 Mar 07 04:45:55 crc kubenswrapper[4689]: I0307 04:45:55.407064 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6l64j" event={"ID":"d737c925-6772-46fe-b0d5-3779bff2aea0","Type":"ContainerDied","Data":"a4fb32bcead719c2d73412506b5eef63823aa3d0d8ba63b5521d5839c3f27dfd"} Mar 07 04:45:55 crc kubenswrapper[4689]: I0307 04:45:55.407286 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6l64j" event={"ID":"d737c925-6772-46fe-b0d5-3779bff2aea0","Type":"ContainerStarted","Data":"fa2dd73bbfaebace004f690402f3ddf58a835aae42badabdaa29ecc46f8a11ba"} Mar 07 04:45:56 crc kubenswrapper[4689]: I0307 04:45:56.417316 4689 generic.go:334] "Generic (PLEG): container finished" podID="d737c925-6772-46fe-b0d5-3779bff2aea0" containerID="08ae0bacc463aec25f4e142e5a19d9b453f6000b7d86085f227b078df444c0be" exitCode=0 Mar 07 04:45:56 crc kubenswrapper[4689]: I0307 04:45:56.417367 4689 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6l64j" event={"ID":"d737c925-6772-46fe-b0d5-3779bff2aea0","Type":"ContainerDied","Data":"08ae0bacc463aec25f4e142e5a19d9b453f6000b7d86085f227b078df444c0be"} Mar 07 04:45:57 crc kubenswrapper[4689]: I0307 04:45:57.073463 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:57 crc kubenswrapper[4689]: I0307 04:45:57.073855 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:57 crc kubenswrapper[4689]: I0307 04:45:57.126625 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:57 crc kubenswrapper[4689]: I0307 04:45:57.427318 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6l64j" event={"ID":"d737c925-6772-46fe-b0d5-3779bff2aea0","Type":"ContainerStarted","Data":"82cfaca1ff5f278d3e23be54e8f11c65112846a7b073872d411e80aa77868de6"} Mar 07 04:45:57 crc kubenswrapper[4689]: I0307 04:45:57.451765 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6l64j" podStartSLOduration=2.031952478 podStartE2EDuration="3.451746049s" podCreationTimestamp="2026-03-07 04:45:54 +0000 UTC" firstStartedPulling="2026-03-07 04:45:55.408764338 +0000 UTC m=+1600.455147837" lastFinishedPulling="2026-03-07 04:45:56.828557919 +0000 UTC m=+1601.874941408" observedRunningTime="2026-03-07 04:45:57.447459404 +0000 UTC m=+1602.493842903" watchObservedRunningTime="2026-03-07 04:45:57.451746049 +0000 UTC m=+1602.498129538" Mar 07 04:45:57 crc kubenswrapper[4689]: I0307 04:45:57.472885 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:45:57 crc 
kubenswrapper[4689]: I0307 04:45:57.826318 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:45:57 crc kubenswrapper[4689]: E0307 04:45:57.826790 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:45:59 crc kubenswrapper[4689]: I0307 04:45:59.518970 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xhc66"] Mar 07 04:45:59 crc kubenswrapper[4689]: I0307 04:45:59.519625 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xhc66" podUID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" containerName="registry-server" containerID="cri-o://cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac" gracePeriod=2 Mar 07 04:46:00 crc kubenswrapper[4689]: I0307 04:46:00.154910 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547646-p8rj9"] Mar 07 04:46:00 crc kubenswrapper[4689]: I0307 04:46:00.155816 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547646-p8rj9" Mar 07 04:46:00 crc kubenswrapper[4689]: I0307 04:46:00.158609 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:46:00 crc kubenswrapper[4689]: I0307 04:46:00.160113 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:46:00 crc kubenswrapper[4689]: I0307 04:46:00.166312 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:46:00 crc kubenswrapper[4689]: I0307 04:46:00.177938 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547646-p8rj9"] Mar 07 04:46:00 crc kubenswrapper[4689]: I0307 04:46:00.220154 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbfbr\" (UniqueName: \"kubernetes.io/projected/305d3ac0-fe32-4daf-90bb-4a57426aed26-kube-api-access-xbfbr\") pod \"auto-csr-approver-29547646-p8rj9\" (UID: \"305d3ac0-fe32-4daf-90bb-4a57426aed26\") " pod="openshift-infra/auto-csr-approver-29547646-p8rj9" Mar 07 04:46:00 crc kubenswrapper[4689]: I0307 04:46:00.321881 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbfbr\" (UniqueName: \"kubernetes.io/projected/305d3ac0-fe32-4daf-90bb-4a57426aed26-kube-api-access-xbfbr\") pod \"auto-csr-approver-29547646-p8rj9\" (UID: \"305d3ac0-fe32-4daf-90bb-4a57426aed26\") " pod="openshift-infra/auto-csr-approver-29547646-p8rj9" Mar 07 04:46:00 crc kubenswrapper[4689]: I0307 04:46:00.349624 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbfbr\" (UniqueName: \"kubernetes.io/projected/305d3ac0-fe32-4daf-90bb-4a57426aed26-kube-api-access-xbfbr\") pod \"auto-csr-approver-29547646-p8rj9\" (UID: \"305d3ac0-fe32-4daf-90bb-4a57426aed26\") " 
pod="openshift-infra/auto-csr-approver-29547646-p8rj9" Mar 07 04:46:00 crc kubenswrapper[4689]: I0307 04:46:00.485964 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547646-p8rj9" Mar 07 04:46:00 crc kubenswrapper[4689]: I0307 04:46:00.750446 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547646-p8rj9"] Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.005036 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.134601 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2hng\" (UniqueName: \"kubernetes.io/projected/def1cdb6-7ee9-4587-97d1-01066d6bdb16-kube-api-access-v2hng\") pod \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.134702 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-catalog-content\") pod \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.134738 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-utilities\") pod \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\" (UID: \"def1cdb6-7ee9-4587-97d1-01066d6bdb16\") " Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.136487 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-utilities" (OuterVolumeSpecName: "utilities") pod "def1cdb6-7ee9-4587-97d1-01066d6bdb16" (UID: 
"def1cdb6-7ee9-4587-97d1-01066d6bdb16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.142928 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def1cdb6-7ee9-4587-97d1-01066d6bdb16-kube-api-access-v2hng" (OuterVolumeSpecName: "kube-api-access-v2hng") pod "def1cdb6-7ee9-4587-97d1-01066d6bdb16" (UID: "def1cdb6-7ee9-4587-97d1-01066d6bdb16"). InnerVolumeSpecName "kube-api-access-v2hng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.221596 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "def1cdb6-7ee9-4587-97d1-01066d6bdb16" (UID: "def1cdb6-7ee9-4587-97d1-01066d6bdb16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.236452 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2hng\" (UniqueName: \"kubernetes.io/projected/def1cdb6-7ee9-4587-97d1-01066d6bdb16-kube-api-access-v2hng\") on node \"crc\" DevicePath \"\"" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.236488 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.236502 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def1cdb6-7ee9-4587-97d1-01066d6bdb16-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.323919 4689 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-vbhd2"] Mar 07 04:46:01 crc kubenswrapper[4689]: E0307 04:46:01.324272 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" containerName="extract-utilities" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.324300 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" containerName="extract-utilities" Mar 07 04:46:01 crc kubenswrapper[4689]: E0307 04:46:01.324341 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" containerName="extract-content" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.324354 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" containerName="extract-content" Mar 07 04:46:01 crc kubenswrapper[4689]: E0307 04:46:01.324373 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" containerName="registry-server" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.324385 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" containerName="registry-server" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.324559 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" containerName="registry-server" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.325739 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.339084 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbhd2"] Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.438380 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5stzz\" (UniqueName: \"kubernetes.io/projected/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-kube-api-access-5stzz\") pod \"community-operators-vbhd2\" (UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.438424 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-catalog-content\") pod \"community-operators-vbhd2\" (UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.438491 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-utilities\") pod \"community-operators-vbhd2\" (UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.458456 4689 generic.go:334] "Generic (PLEG): container finished" podID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" containerID="cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac" exitCode=0 Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.458518 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xhc66" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.458578 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhc66" event={"ID":"def1cdb6-7ee9-4587-97d1-01066d6bdb16","Type":"ContainerDied","Data":"cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac"} Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.458606 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhc66" event={"ID":"def1cdb6-7ee9-4587-97d1-01066d6bdb16","Type":"ContainerDied","Data":"c5db016fef2298fb766a5d2ce51c83c2c7edabceac09a1f9788de9c736f463bc"} Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.458639 4689 scope.go:117] "RemoveContainer" containerID="cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.460349 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547646-p8rj9" event={"ID":"305d3ac0-fe32-4daf-90bb-4a57426aed26","Type":"ContainerStarted","Data":"39035ddc0fe41c4a31dff66afc005ce43f1f034fad5972c0e041605af85a796e"} Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.481008 4689 scope.go:117] "RemoveContainer" containerID="b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.506417 4689 scope.go:117] "RemoveContainer" containerID="647742d10a3532198d2e75c8f34248bcb420e699e166f8807b4b270acb77b703" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.510128 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xhc66"] Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.516657 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xhc66"] Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.533664 4689 
scope.go:117] "RemoveContainer" containerID="cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac" Mar 07 04:46:01 crc kubenswrapper[4689]: E0307 04:46:01.536653 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac\": container with ID starting with cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac not found: ID does not exist" containerID="cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.536690 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac"} err="failed to get container status \"cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac\": rpc error: code = NotFound desc = could not find container \"cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac\": container with ID starting with cb0036ab0e6cf3c26bd8d775b3418de639b609243f2003123827543aeb49bfac not found: ID does not exist" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.536717 4689 scope.go:117] "RemoveContainer" containerID="b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.539475 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5stzz\" (UniqueName: \"kubernetes.io/projected/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-kube-api-access-5stzz\") pod \"community-operators-vbhd2\" (UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.539504 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-catalog-content\") pod \"community-operators-vbhd2\" (UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.539561 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-utilities\") pod \"community-operators-vbhd2\" (UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.539950 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-utilities\") pod \"community-operators-vbhd2\" (UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.540404 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-catalog-content\") pod \"community-operators-vbhd2\" (UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:01 crc kubenswrapper[4689]: E0307 04:46:01.540471 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8\": container with ID starting with b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8 not found: ID does not exist" containerID="b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.540490 4689 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8"} err="failed to get container status \"b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8\": rpc error: code = NotFound desc = could not find container \"b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8\": container with ID starting with b26f2f2e4e3229f35719039c808314eb14c3c586b0df48ff17d118d8c80cf1a8 not found: ID does not exist" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.540505 4689 scope.go:117] "RemoveContainer" containerID="647742d10a3532198d2e75c8f34248bcb420e699e166f8807b4b270acb77b703" Mar 07 04:46:01 crc kubenswrapper[4689]: E0307 04:46:01.544324 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"647742d10a3532198d2e75c8f34248bcb420e699e166f8807b4b270acb77b703\": container with ID starting with 647742d10a3532198d2e75c8f34248bcb420e699e166f8807b4b270acb77b703 not found: ID does not exist" containerID="647742d10a3532198d2e75c8f34248bcb420e699e166f8807b4b270acb77b703" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.544351 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"647742d10a3532198d2e75c8f34248bcb420e699e166f8807b4b270acb77b703"} err="failed to get container status \"647742d10a3532198d2e75c8f34248bcb420e699e166f8807b4b270acb77b703\": rpc error: code = NotFound desc = could not find container \"647742d10a3532198d2e75c8f34248bcb420e699e166f8807b4b270acb77b703\": container with ID starting with 647742d10a3532198d2e75c8f34248bcb420e699e166f8807b4b270acb77b703 not found: ID does not exist" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.567444 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5stzz\" (UniqueName: \"kubernetes.io/projected/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-kube-api-access-5stzz\") pod \"community-operators-vbhd2\" 
(UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.639074 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:01 crc kubenswrapper[4689]: I0307 04:46:01.848360 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def1cdb6-7ee9-4587-97d1-01066d6bdb16" path="/var/lib/kubelet/pods/def1cdb6-7ee9-4587-97d1-01066d6bdb16/volumes" Mar 07 04:46:02 crc kubenswrapper[4689]: I0307 04:46:02.100981 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbhd2"] Mar 07 04:46:02 crc kubenswrapper[4689]: W0307 04:46:02.107718 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d801f99_7686_4be7_a3fb_4f1971ca3a1d.slice/crio-d1799acaf53e67116b029a3e3be323da66773530c1081482495170cbd36164a0 WatchSource:0}: Error finding container d1799acaf53e67116b029a3e3be323da66773530c1081482495170cbd36164a0: Status 404 returned error can't find the container with id d1799acaf53e67116b029a3e3be323da66773530c1081482495170cbd36164a0 Mar 07 04:46:02 crc kubenswrapper[4689]: I0307 04:46:02.469663 4689 generic.go:334] "Generic (PLEG): container finished" podID="305d3ac0-fe32-4daf-90bb-4a57426aed26" containerID="1f48436e09c96bae6a439165eb8ed88a9d65798e71a8632de5e6cfa33f44821d" exitCode=0 Mar 07 04:46:02 crc kubenswrapper[4689]: I0307 04:46:02.469841 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547646-p8rj9" event={"ID":"305d3ac0-fe32-4daf-90bb-4a57426aed26","Type":"ContainerDied","Data":"1f48436e09c96bae6a439165eb8ed88a9d65798e71a8632de5e6cfa33f44821d"} Mar 07 04:46:02 crc kubenswrapper[4689]: I0307 04:46:02.471238 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" containerID="54664c938cd28ce1e0734d807397d6c8d69af52a6c3aaa73c787fd8d39fe404e" exitCode=0 Mar 07 04:46:02 crc kubenswrapper[4689]: I0307 04:46:02.471278 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhd2" event={"ID":"4d801f99-7686-4be7-a3fb-4f1971ca3a1d","Type":"ContainerDied","Data":"54664c938cd28ce1e0734d807397d6c8d69af52a6c3aaa73c787fd8d39fe404e"} Mar 07 04:46:02 crc kubenswrapper[4689]: I0307 04:46:02.471307 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhd2" event={"ID":"4d801f99-7686-4be7-a3fb-4f1971ca3a1d","Type":"ContainerStarted","Data":"d1799acaf53e67116b029a3e3be323da66773530c1081482495170cbd36164a0"} Mar 07 04:46:03 crc kubenswrapper[4689]: I0307 04:46:03.478203 4689 generic.go:334] "Generic (PLEG): container finished" podID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" containerID="94f9ee9b4fec3148c4ab29c886031d7adc57ef5c1037342e12aebef8378c55e7" exitCode=0 Mar 07 04:46:03 crc kubenswrapper[4689]: I0307 04:46:03.479115 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhd2" event={"ID":"4d801f99-7686-4be7-a3fb-4f1971ca3a1d","Type":"ContainerDied","Data":"94f9ee9b4fec3148c4ab29c886031d7adc57ef5c1037342e12aebef8378c55e7"} Mar 07 04:46:03 crc kubenswrapper[4689]: I0307 04:46:03.747093 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547646-p8rj9" Mar 07 04:46:03 crc kubenswrapper[4689]: I0307 04:46:03.785896 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbfbr\" (UniqueName: \"kubernetes.io/projected/305d3ac0-fe32-4daf-90bb-4a57426aed26-kube-api-access-xbfbr\") pod \"305d3ac0-fe32-4daf-90bb-4a57426aed26\" (UID: \"305d3ac0-fe32-4daf-90bb-4a57426aed26\") " Mar 07 04:46:03 crc kubenswrapper[4689]: I0307 04:46:03.796677 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305d3ac0-fe32-4daf-90bb-4a57426aed26-kube-api-access-xbfbr" (OuterVolumeSpecName: "kube-api-access-xbfbr") pod "305d3ac0-fe32-4daf-90bb-4a57426aed26" (UID: "305d3ac0-fe32-4daf-90bb-4a57426aed26"). InnerVolumeSpecName "kube-api-access-xbfbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:46:03 crc kubenswrapper[4689]: I0307 04:46:03.887849 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbfbr\" (UniqueName: \"kubernetes.io/projected/305d3ac0-fe32-4daf-90bb-4a57426aed26-kube-api-access-xbfbr\") on node \"crc\" DevicePath \"\"" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.084692 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/util/0.log" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.265919 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/util/0.log" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.307003 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/pull/0.log" Mar 07 04:46:04 crc 
kubenswrapper[4689]: I0307 04:46:04.308780 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/pull/0.log" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.490379 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.490645 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.495616 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/pull/0.log" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.495751 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/util/0.log" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.504369 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547646-p8rj9" event={"ID":"305d3ac0-fe32-4daf-90bb-4a57426aed26","Type":"ContainerDied","Data":"39035ddc0fe41c4a31dff66afc005ce43f1f034fad5972c0e041605af85a796e"} Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.504399 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547646-p8rj9" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.504404 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39035ddc0fe41c4a31dff66afc005ce43f1f034fad5972c0e041605af85a796e" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.506453 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhd2" event={"ID":"4d801f99-7686-4be7-a3fb-4f1971ca3a1d","Type":"ContainerStarted","Data":"9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee"} Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.514855 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/extract/0.log" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.523617 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vbhd2" podStartSLOduration=1.85998433 podStartE2EDuration="3.523600802s" podCreationTimestamp="2026-03-07 04:46:01 +0000 UTC" firstStartedPulling="2026-03-07 04:46:02.472627146 +0000 UTC m=+1607.519010645" lastFinishedPulling="2026-03-07 04:46:04.136243628 +0000 UTC m=+1609.182627117" observedRunningTime="2026-03-07 04:46:04.518885456 +0000 UTC m=+1609.565268945" watchObservedRunningTime="2026-03-07 04:46:04.523600802 +0000 UTC m=+1609.569984291" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.548646 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.696754 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5df65d59b6-8hmtq_96b13de3-e5e2-456c-8b92-fb7adb492a65/manager/0.log" Mar 07 04:46:04 crc 
kubenswrapper[4689]: I0307 04:46:04.757057 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-2rfwv_4fbda293-a134-43ca-8f42-6bc32bae4b57/registry-server/0.log" Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.814680 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547640-t6v74"] Mar 07 04:46:04 crc kubenswrapper[4689]: I0307 04:46:04.817909 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547640-t6v74"] Mar 07 04:46:05 crc kubenswrapper[4689]: I0307 04:46:05.562302 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:46:05 crc kubenswrapper[4689]: I0307 04:46:05.834150 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eced259c-8292-4c5f-9698-a6830b08653a" path="/var/lib/kubelet/pods/eced259c-8292-4c5f-9698-a6830b08653a/volumes" Mar 07 04:46:07 crc kubenswrapper[4689]: I0307 04:46:07.718248 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6l64j"] Mar 07 04:46:07 crc kubenswrapper[4689]: I0307 04:46:07.719103 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6l64j" podUID="d737c925-6772-46fe-b0d5-3779bff2aea0" containerName="registry-server" containerID="cri-o://82cfaca1ff5f278d3e23be54e8f11c65112846a7b073872d411e80aa77868de6" gracePeriod=2 Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.540318 4689 generic.go:334] "Generic (PLEG): container finished" podID="d737c925-6772-46fe-b0d5-3779bff2aea0" containerID="82cfaca1ff5f278d3e23be54e8f11c65112846a7b073872d411e80aa77868de6" exitCode=0 Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.540394 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6l64j" 
event={"ID":"d737c925-6772-46fe-b0d5-3779bff2aea0","Type":"ContainerDied","Data":"82cfaca1ff5f278d3e23be54e8f11c65112846a7b073872d411e80aa77868de6"} Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.624336 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.649963 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-utilities\") pod \"d737c925-6772-46fe-b0d5-3779bff2aea0\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.650139 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-catalog-content\") pod \"d737c925-6772-46fe-b0d5-3779bff2aea0\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.650306 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rjns\" (UniqueName: \"kubernetes.io/projected/d737c925-6772-46fe-b0d5-3779bff2aea0-kube-api-access-9rjns\") pod \"d737c925-6772-46fe-b0d5-3779bff2aea0\" (UID: \"d737c925-6772-46fe-b0d5-3779bff2aea0\") " Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.652272 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-utilities" (OuterVolumeSpecName: "utilities") pod "d737c925-6772-46fe-b0d5-3779bff2aea0" (UID: "d737c925-6772-46fe-b0d5-3779bff2aea0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.658691 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d737c925-6772-46fe-b0d5-3779bff2aea0-kube-api-access-9rjns" (OuterVolumeSpecName: "kube-api-access-9rjns") pod "d737c925-6772-46fe-b0d5-3779bff2aea0" (UID: "d737c925-6772-46fe-b0d5-3779bff2aea0"). InnerVolumeSpecName "kube-api-access-9rjns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.680430 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d737c925-6772-46fe-b0d5-3779bff2aea0" (UID: "d737c925-6772-46fe-b0d5-3779bff2aea0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.752857 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.752912 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d737c925-6772-46fe-b0d5-3779bff2aea0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:46:08 crc kubenswrapper[4689]: I0307 04:46:08.752981 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rjns\" (UniqueName: \"kubernetes.io/projected/d737c925-6772-46fe-b0d5-3779bff2aea0-kube-api-access-9rjns\") on node \"crc\" DevicePath \"\"" Mar 07 04:46:09 crc kubenswrapper[4689]: I0307 04:46:09.552957 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6l64j" 
event={"ID":"d737c925-6772-46fe-b0d5-3779bff2aea0","Type":"ContainerDied","Data":"fa2dd73bbfaebace004f690402f3ddf58a835aae42badabdaa29ecc46f8a11ba"} Mar 07 04:46:09 crc kubenswrapper[4689]: I0307 04:46:09.553030 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6l64j" Mar 07 04:46:09 crc kubenswrapper[4689]: I0307 04:46:09.553265 4689 scope.go:117] "RemoveContainer" containerID="82cfaca1ff5f278d3e23be54e8f11c65112846a7b073872d411e80aa77868de6" Mar 07 04:46:09 crc kubenswrapper[4689]: I0307 04:46:09.570779 4689 scope.go:117] "RemoveContainer" containerID="08ae0bacc463aec25f4e142e5a19d9b453f6000b7d86085f227b078df444c0be" Mar 07 04:46:09 crc kubenswrapper[4689]: I0307 04:46:09.577767 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6l64j"] Mar 07 04:46:09 crc kubenswrapper[4689]: I0307 04:46:09.591858 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6l64j"] Mar 07 04:46:09 crc kubenswrapper[4689]: I0307 04:46:09.608259 4689 scope.go:117] "RemoveContainer" containerID="a4fb32bcead719c2d73412506b5eef63823aa3d0d8ba63b5521d5839c3f27dfd" Mar 07 04:46:09 crc kubenswrapper[4689]: I0307 04:46:09.827582 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:46:09 crc kubenswrapper[4689]: E0307 04:46:09.827828 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:46:09 crc kubenswrapper[4689]: I0307 04:46:09.833253 4689 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="d737c925-6772-46fe-b0d5-3779bff2aea0" path="/var/lib/kubelet/pods/d737c925-6772-46fe-b0d5-3779bff2aea0/volumes" Mar 07 04:46:10 crc kubenswrapper[4689]: E0307 04:46:10.678063 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:46:10 crc kubenswrapper[4689]: E0307 04:46:10.678345 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:47:14.678330651 +0000 UTC m=+1679.724714140 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:46:10 crc kubenswrapper[4689]: E0307 04:46:10.678104 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:46:10 crc kubenswrapper[4689]: E0307 04:46:10.678457 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:47:14.678436094 +0000 UTC m=+1679.724819663 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:46:11 crc kubenswrapper[4689]: I0307 04:46:11.639887 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:11 crc kubenswrapper[4689]: I0307 04:46:11.639934 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:11 crc kubenswrapper[4689]: I0307 04:46:11.676004 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:12 crc kubenswrapper[4689]: I0307 04:46:12.632067 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:12 crc kubenswrapper[4689]: I0307 04:46:12.919235 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbhd2"] Mar 07 04:46:14 crc kubenswrapper[4689]: I0307 04:46:14.587745 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vbhd2" podUID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" containerName="registry-server" containerID="cri-o://9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee" gracePeriod=2 Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.038348 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.140626 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-catalog-content\") pod \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\" (UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.140735 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5stzz\" (UniqueName: \"kubernetes.io/projected/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-kube-api-access-5stzz\") pod \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\" (UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.140836 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-utilities\") pod \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\" (UID: \"4d801f99-7686-4be7-a3fb-4f1971ca3a1d\") " Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.141773 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-utilities" (OuterVolumeSpecName: "utilities") pod "4d801f99-7686-4be7-a3fb-4f1971ca3a1d" (UID: "4d801f99-7686-4be7-a3fb-4f1971ca3a1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.146508 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-kube-api-access-5stzz" (OuterVolumeSpecName: "kube-api-access-5stzz") pod "4d801f99-7686-4be7-a3fb-4f1971ca3a1d" (UID: "4d801f99-7686-4be7-a3fb-4f1971ca3a1d"). InnerVolumeSpecName "kube-api-access-5stzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.212894 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d801f99-7686-4be7-a3fb-4f1971ca3a1d" (UID: "4d801f99-7686-4be7-a3fb-4f1971ca3a1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.243368 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.243404 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.243422 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5stzz\" (UniqueName: \"kubernetes.io/projected/4d801f99-7686-4be7-a3fb-4f1971ca3a1d-kube-api-access-5stzz\") on node \"crc\" DevicePath \"\"" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.605977 4689 generic.go:334] "Generic (PLEG): container finished" podID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" containerID="9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee" exitCode=0 Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.606025 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhd2" event={"ID":"4d801f99-7686-4be7-a3fb-4f1971ca3a1d","Type":"ContainerDied","Data":"9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee"} Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.606033 4689 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbhd2" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.606059 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbhd2" event={"ID":"4d801f99-7686-4be7-a3fb-4f1971ca3a1d","Type":"ContainerDied","Data":"d1799acaf53e67116b029a3e3be323da66773530c1081482495170cbd36164a0"} Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.606080 4689 scope.go:117] "RemoveContainer" containerID="9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.638549 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbhd2"] Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.643776 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vbhd2"] Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.645143 4689 scope.go:117] "RemoveContainer" containerID="94f9ee9b4fec3148c4ab29c886031d7adc57ef5c1037342e12aebef8378c55e7" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.674303 4689 scope.go:117] "RemoveContainer" containerID="54664c938cd28ce1e0734d807397d6c8d69af52a6c3aaa73c787fd8d39fe404e" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.695753 4689 scope.go:117] "RemoveContainer" containerID="9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee" Mar 07 04:46:15 crc kubenswrapper[4689]: E0307 04:46:15.696458 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee\": container with ID starting with 9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee not found: ID does not exist" containerID="9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.696530 
4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee"} err="failed to get container status \"9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee\": rpc error: code = NotFound desc = could not find container \"9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee\": container with ID starting with 9bac33dc07af581fc773e62fb10ce30271297613cf13dd15a72c6a0a1be7a7ee not found: ID does not exist" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.696572 4689 scope.go:117] "RemoveContainer" containerID="94f9ee9b4fec3148c4ab29c886031d7adc57ef5c1037342e12aebef8378c55e7" Mar 07 04:46:15 crc kubenswrapper[4689]: E0307 04:46:15.697042 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94f9ee9b4fec3148c4ab29c886031d7adc57ef5c1037342e12aebef8378c55e7\": container with ID starting with 94f9ee9b4fec3148c4ab29c886031d7adc57ef5c1037342e12aebef8378c55e7 not found: ID does not exist" containerID="94f9ee9b4fec3148c4ab29c886031d7adc57ef5c1037342e12aebef8378c55e7" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.697115 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94f9ee9b4fec3148c4ab29c886031d7adc57ef5c1037342e12aebef8378c55e7"} err="failed to get container status \"94f9ee9b4fec3148c4ab29c886031d7adc57ef5c1037342e12aebef8378c55e7\": rpc error: code = NotFound desc = could not find container \"94f9ee9b4fec3148c4ab29c886031d7adc57ef5c1037342e12aebef8378c55e7\": container with ID starting with 94f9ee9b4fec3148c4ab29c886031d7adc57ef5c1037342e12aebef8378c55e7 not found: ID does not exist" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.697162 4689 scope.go:117] "RemoveContainer" containerID="54664c938cd28ce1e0734d807397d6c8d69af52a6c3aaa73c787fd8d39fe404e" Mar 07 04:46:15 crc kubenswrapper[4689]: E0307 
04:46:15.697650 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54664c938cd28ce1e0734d807397d6c8d69af52a6c3aaa73c787fd8d39fe404e\": container with ID starting with 54664c938cd28ce1e0734d807397d6c8d69af52a6c3aaa73c787fd8d39fe404e not found: ID does not exist" containerID="54664c938cd28ce1e0734d807397d6c8d69af52a6c3aaa73c787fd8d39fe404e" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.697700 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54664c938cd28ce1e0734d807397d6c8d69af52a6c3aaa73c787fd8d39fe404e"} err="failed to get container status \"54664c938cd28ce1e0734d807397d6c8d69af52a6c3aaa73c787fd8d39fe404e\": rpc error: code = NotFound desc = could not find container \"54664c938cd28ce1e0734d807397d6c8d69af52a6c3aaa73c787fd8d39fe404e\": container with ID starting with 54664c938cd28ce1e0734d807397d6c8d69af52a6c3aaa73c787fd8d39fe404e not found: ID does not exist" Mar 07 04:46:15 crc kubenswrapper[4689]: I0307 04:46:15.833236 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" path="/var/lib/kubelet/pods/4d801f99-7686-4be7-a3fb-4f1971ca3a1d/volumes" Mar 07 04:46:19 crc kubenswrapper[4689]: I0307 04:46:19.589093 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wblnn_ee57bd24-197d-4722-9a1a-a73e914a0973/control-plane-machine-set-operator/0.log" Mar 07 04:46:19 crc kubenswrapper[4689]: I0307 04:46:19.712437 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8ggcp_3ec0b40d-04d4-486b-93bc-361c72d74aad/machine-api-operator/0.log" Mar 07 04:46:19 crc kubenswrapper[4689]: I0307 04:46:19.760084 4689 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8ggcp_3ec0b40d-04d4-486b-93bc-361c72d74aad/kube-rbac-proxy/0.log" Mar 07 04:46:23 crc kubenswrapper[4689]: I0307 04:46:23.826304 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:46:23 crc kubenswrapper[4689]: E0307 04:46:23.826872 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:46:38 crc kubenswrapper[4689]: I0307 04:46:38.826079 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:46:38 crc kubenswrapper[4689]: E0307 04:46:38.826796 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:46:48 crc kubenswrapper[4689]: I0307 04:46:48.651402 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-z2zk8_92ada8da-b00e-4106-8831-bfe7a78d4806/kube-rbac-proxy/0.log" Mar 07 04:46:48 crc kubenswrapper[4689]: I0307 04:46:48.690532 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-z2zk8_92ada8da-b00e-4106-8831-bfe7a78d4806/controller/0.log" Mar 07 04:46:48 crc kubenswrapper[4689]: I0307 
04:46:48.787296 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-frr-files/0.log" Mar 07 04:46:48 crc kubenswrapper[4689]: I0307 04:46:48.953620 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-frr-files/0.log" Mar 07 04:46:48 crc kubenswrapper[4689]: I0307 04:46:48.989071 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-reloader/0.log" Mar 07 04:46:48 crc kubenswrapper[4689]: I0307 04:46:48.991553 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-metrics/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.006367 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-reloader/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.176034 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-frr-files/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.181778 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-reloader/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.205826 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-metrics/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.207501 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-metrics/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.378722 4689 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-frr-files/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.382267 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-metrics/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.404631 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/controller/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.424202 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-reloader/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.569392 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/frr-metrics/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.572665 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/kube-rbac-proxy/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.613379 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/kube-rbac-proxy-frr/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.786857 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/reloader/0.log" Mar 07 04:46:49 crc kubenswrapper[4689]: I0307 04:46:49.914888 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-d5xx7_9f428eff-914b-4bee-a9ee-7399d39a38c0/frr-k8s-webhook-server/0.log" Mar 07 04:46:50 crc kubenswrapper[4689]: I0307 04:46:50.005149 4689 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-77cb8466b4-dgs2t_c42d4852-f686-4b2c-a03e-735b386d752a/manager/0.log" Mar 07 04:46:50 crc kubenswrapper[4689]: I0307 04:46:50.134136 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-65b497c9c9-r86tt_3838aa56-d0d3-4bce-95d0-7e760c2be14b/webhook-server/0.log" Mar 07 04:46:50 crc kubenswrapper[4689]: I0307 04:46:50.269570 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mpvnx_c6fd9827-217f-4143-94c3-13c5c8257e98/kube-rbac-proxy/0.log" Mar 07 04:46:50 crc kubenswrapper[4689]: I0307 04:46:50.355104 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/frr/0.log" Mar 07 04:46:50 crc kubenswrapper[4689]: I0307 04:46:50.462919 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mpvnx_c6fd9827-217f-4143-94c3-13c5c8257e98/speaker/0.log" Mar 07 04:46:52 crc kubenswrapper[4689]: I0307 04:46:52.826243 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:46:52 crc kubenswrapper[4689]: E0307 04:46:52.826643 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:46:53 crc kubenswrapper[4689]: I0307 04:46:53.172816 4689 scope.go:117] "RemoveContainer" containerID="7404fcd03e14e2a6214723039b11259a7b435865c11742201d4bee45bb36582a" Mar 07 04:46:53 crc kubenswrapper[4689]: I0307 04:46:53.235595 4689 scope.go:117] "RemoveContainer" 
containerID="96ecd1348d8aedf7e7c82ab6d63a083426f803c2869df540471a05f43c70ebcf" Mar 07 04:46:53 crc kubenswrapper[4689]: I0307 04:46:53.254507 4689 scope.go:117] "RemoveContainer" containerID="6581163c011f345b4ceb2e523c7d3ca03cc42ac901270c85b57fd83e692a48e2" Mar 07 04:46:53 crc kubenswrapper[4689]: I0307 04:46:53.314072 4689 scope.go:117] "RemoveContainer" containerID="8c2fc50de57b0e9a6da42b42437cb93ae36d7629ec62c582a0cb40619c73a64c" Mar 07 04:46:53 crc kubenswrapper[4689]: I0307 04:46:53.331788 4689 scope.go:117] "RemoveContainer" containerID="cacb5db30963fd44e517647a02a03df3f2281fc0e783bb876f4cd09b706e1ca5" Mar 07 04:46:53 crc kubenswrapper[4689]: I0307 04:46:53.356641 4689 scope.go:117] "RemoveContainer" containerID="db77210edc7bd42b64114823658b941bdfdefc1225d3d7a4f8d6c3b89ed1fe89" Mar 07 04:46:53 crc kubenswrapper[4689]: I0307 04:46:53.389376 4689 scope.go:117] "RemoveContainer" containerID="e3022f74671f98f056db6c0e381371a3e55ee18b9e417b193de56354d2305b91" Mar 07 04:47:03 crc kubenswrapper[4689]: I0307 04:47:03.563596 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_3c25a937-0d93-4077-92d7-fbeac4f6abb3/openstackclient/0.log" Mar 07 04:47:07 crc kubenswrapper[4689]: I0307 04:47:07.825912 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:47:07 crc kubenswrapper[4689]: E0307 04:47:07.826339 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:47:14 crc kubenswrapper[4689]: E0307 04:47:14.724747 4689 configmap.go:193] Couldn't get configMap 
glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:47:14 crc kubenswrapper[4689]: E0307 04:47:14.725238 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:49:16.725219054 +0000 UTC m=+1801.771602633 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:47:14 crc kubenswrapper[4689]: E0307 04:47:14.724879 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:47:14 crc kubenswrapper[4689]: E0307 04:47:14.725352 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:49:16.725318017 +0000 UTC m=+1801.771701556 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:47:17 crc kubenswrapper[4689]: I0307 04:47:17.338162 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-utilities/0.log" Mar 07 04:47:17 crc kubenswrapper[4689]: I0307 04:47:17.602068 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-utilities/0.log" Mar 07 04:47:17 crc kubenswrapper[4689]: I0307 04:47:17.647892 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-content/0.log" Mar 07 04:47:17 crc kubenswrapper[4689]: I0307 04:47:17.651096 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-content/0.log" Mar 07 04:47:17 crc kubenswrapper[4689]: I0307 04:47:17.739405 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-utilities/0.log" Mar 07 04:47:17 crc kubenswrapper[4689]: I0307 04:47:17.769706 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-content/0.log" Mar 07 04:47:17 crc kubenswrapper[4689]: I0307 04:47:17.981760 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-utilities/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.096044 
4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-utilities/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.126575 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/registry-server/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.145123 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-content/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.183588 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-content/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.329863 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-utilities/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.346606 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-content/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.527709 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/util/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.693714 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/pull/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.737705 4689 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/registry-server/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.751807 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/util/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.770464 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/pull/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.895160 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/util/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.902893 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/pull/0.log" Mar 07 04:47:18 crc kubenswrapper[4689]: I0307 04:47:18.927592 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/extract/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.040771 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2qc7s_431626bb-08c9-4190-83e1-d4d5fd7cb198/marketplace-operator/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.122236 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-utilities/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.242500 4689 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-content/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.244099 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-utilities/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.318035 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-content/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.451208 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-utilities/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.461421 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-content/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.476763 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/registry-server/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.629625 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-utilities/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.794581 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-content/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.812337 4689 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-utilities/0.log" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.825724 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:47:19 crc kubenswrapper[4689]: E0307 04:47:19.826000 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:47:19 crc kubenswrapper[4689]: I0307 04:47:19.840040 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-content/0.log" Mar 07 04:47:20 crc kubenswrapper[4689]: I0307 04:47:20.001011 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-content/0.log" Mar 07 04:47:20 crc kubenswrapper[4689]: I0307 04:47:20.017657 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-utilities/0.log" Mar 07 04:47:20 crc kubenswrapper[4689]: I0307 04:47:20.366793 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/registry-server/0.log" Mar 07 04:47:32 crc kubenswrapper[4689]: I0307 04:47:32.825554 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:47:32 crc kubenswrapper[4689]: E0307 04:47:32.826332 4689 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:47:47 crc kubenswrapper[4689]: I0307 04:47:47.826304 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:47:47 crc kubenswrapper[4689]: E0307 04:47:47.827242 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:47:53 crc kubenswrapper[4689]: I0307 04:47:53.518525 4689 scope.go:117] "RemoveContainer" containerID="d38bf174162378fb2da9da159d97d13ebf904fb880d611cad3eb253e712520b8" Mar 07 04:47:53 crc kubenswrapper[4689]: I0307 04:47:53.548980 4689 scope.go:117] "RemoveContainer" containerID="4a7450be8431eb55a40d28f678bab089af116d3d77f3e7540640b150e3601f72" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.158259 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547648-qjvvn"] Mar 07 04:48:00 crc kubenswrapper[4689]: E0307 04:48:00.159433 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305d3ac0-fe32-4daf-90bb-4a57426aed26" containerName="oc" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.159471 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="305d3ac0-fe32-4daf-90bb-4a57426aed26" containerName="oc" Mar 07 04:48:00 crc 
kubenswrapper[4689]: E0307 04:48:00.159499 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" containerName="extract-utilities" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.159514 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" containerName="extract-utilities" Mar 07 04:48:00 crc kubenswrapper[4689]: E0307 04:48:00.159531 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" containerName="extract-content" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.159544 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" containerName="extract-content" Mar 07 04:48:00 crc kubenswrapper[4689]: E0307 04:48:00.159575 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d737c925-6772-46fe-b0d5-3779bff2aea0" containerName="registry-server" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.159588 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d737c925-6772-46fe-b0d5-3779bff2aea0" containerName="registry-server" Mar 07 04:48:00 crc kubenswrapper[4689]: E0307 04:48:00.159610 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d737c925-6772-46fe-b0d5-3779bff2aea0" containerName="extract-utilities" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.159625 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d737c925-6772-46fe-b0d5-3779bff2aea0" containerName="extract-utilities" Mar 07 04:48:00 crc kubenswrapper[4689]: E0307 04:48:00.159650 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d737c925-6772-46fe-b0d5-3779bff2aea0" containerName="extract-content" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.159662 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d737c925-6772-46fe-b0d5-3779bff2aea0" containerName="extract-content" Mar 07 04:48:00 crc 
kubenswrapper[4689]: E0307 04:48:00.159686 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" containerName="registry-server" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.159700 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" containerName="registry-server" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.159887 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d801f99-7686-4be7-a3fb-4f1971ca3a1d" containerName="registry-server" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.159905 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d737c925-6772-46fe-b0d5-3779bff2aea0" containerName="registry-server" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.159927 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="305d3ac0-fe32-4daf-90bb-4a57426aed26" containerName="oc" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.160826 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547648-qjvvn" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.164653 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.164787 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.167252 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.178326 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547648-qjvvn"] Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.237057 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tbsl\" (UniqueName: \"kubernetes.io/projected/81670773-b472-461f-8b12-7d589c3442e6-kube-api-access-8tbsl\") pod \"auto-csr-approver-29547648-qjvvn\" (UID: \"81670773-b472-461f-8b12-7d589c3442e6\") " pod="openshift-infra/auto-csr-approver-29547648-qjvvn" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.338883 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tbsl\" (UniqueName: \"kubernetes.io/projected/81670773-b472-461f-8b12-7d589c3442e6-kube-api-access-8tbsl\") pod \"auto-csr-approver-29547648-qjvvn\" (UID: \"81670773-b472-461f-8b12-7d589c3442e6\") " pod="openshift-infra/auto-csr-approver-29547648-qjvvn" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.361943 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tbsl\" (UniqueName: \"kubernetes.io/projected/81670773-b472-461f-8b12-7d589c3442e6-kube-api-access-8tbsl\") pod \"auto-csr-approver-29547648-qjvvn\" (UID: \"81670773-b472-461f-8b12-7d589c3442e6\") " 
pod="openshift-infra/auto-csr-approver-29547648-qjvvn" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.508039 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547648-qjvvn" Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.733357 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547648-qjvvn"] Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.749256 4689 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 04:48:00 crc kubenswrapper[4689]: I0307 04:48:00.826283 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:48:00 crc kubenswrapper[4689]: E0307 04:48:00.826532 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:48:01 crc kubenswrapper[4689]: I0307 04:48:01.513735 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547648-qjvvn" event={"ID":"81670773-b472-461f-8b12-7d589c3442e6","Type":"ContainerStarted","Data":"2a9f8b75e252ec897b4fe87e5c66d85e3e5e5a59240db149120cd03acef3edad"} Mar 07 04:48:02 crc kubenswrapper[4689]: I0307 04:48:02.523233 4689 generic.go:334] "Generic (PLEG): container finished" podID="81670773-b472-461f-8b12-7d589c3442e6" containerID="ec8c07d09e7aa473f3218f9b8d32c92942d5ba8b6a143fd180e658cfefff9de5" exitCode=0 Mar 07 04:48:02 crc kubenswrapper[4689]: I0307 04:48:02.523317 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29547648-qjvvn" event={"ID":"81670773-b472-461f-8b12-7d589c3442e6","Type":"ContainerDied","Data":"ec8c07d09e7aa473f3218f9b8d32c92942d5ba8b6a143fd180e658cfefff9de5"} Mar 07 04:48:03 crc kubenswrapper[4689]: I0307 04:48:03.848834 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547648-qjvvn" Mar 07 04:48:03 crc kubenswrapper[4689]: I0307 04:48:03.888742 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tbsl\" (UniqueName: \"kubernetes.io/projected/81670773-b472-461f-8b12-7d589c3442e6-kube-api-access-8tbsl\") pod \"81670773-b472-461f-8b12-7d589c3442e6\" (UID: \"81670773-b472-461f-8b12-7d589c3442e6\") " Mar 07 04:48:03 crc kubenswrapper[4689]: I0307 04:48:03.911417 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81670773-b472-461f-8b12-7d589c3442e6-kube-api-access-8tbsl" (OuterVolumeSpecName: "kube-api-access-8tbsl") pod "81670773-b472-461f-8b12-7d589c3442e6" (UID: "81670773-b472-461f-8b12-7d589c3442e6"). InnerVolumeSpecName "kube-api-access-8tbsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:48:03 crc kubenswrapper[4689]: I0307 04:48:03.996304 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tbsl\" (UniqueName: \"kubernetes.io/projected/81670773-b472-461f-8b12-7d589c3442e6-kube-api-access-8tbsl\") on node \"crc\" DevicePath \"\"" Mar 07 04:48:04 crc kubenswrapper[4689]: I0307 04:48:04.542849 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547648-qjvvn" event={"ID":"81670773-b472-461f-8b12-7d589c3442e6","Type":"ContainerDied","Data":"2a9f8b75e252ec897b4fe87e5c66d85e3e5e5a59240db149120cd03acef3edad"} Mar 07 04:48:04 crc kubenswrapper[4689]: I0307 04:48:04.543108 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a9f8b75e252ec897b4fe87e5c66d85e3e5e5a59240db149120cd03acef3edad" Mar 07 04:48:04 crc kubenswrapper[4689]: I0307 04:48:04.542934 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547648-qjvvn" Mar 07 04:48:04 crc kubenswrapper[4689]: I0307 04:48:04.911611 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547642-bpjt8"] Mar 07 04:48:04 crc kubenswrapper[4689]: I0307 04:48:04.919486 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547642-bpjt8"] Mar 07 04:48:05 crc kubenswrapper[4689]: I0307 04:48:05.837368 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4221dd55-156a-45e5-8de8-e5820ecb5f10" path="/var/lib/kubelet/pods/4221dd55-156a-45e5-8de8-e5820ecb5f10/volumes" Mar 07 04:48:11 crc kubenswrapper[4689]: I0307 04:48:11.825880 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:48:11 crc kubenswrapper[4689]: E0307 04:48:11.826985 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:48:23 crc kubenswrapper[4689]: I0307 04:48:23.826344 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:48:23 crc kubenswrapper[4689]: E0307 04:48:23.827295 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:48:31 crc kubenswrapper[4689]: I0307 04:48:31.801312 4689 generic.go:334] "Generic (PLEG): container finished" podID="667e4097-0a9c-40d6-a15a-c7a0066085ac" containerID="606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2" exitCode=0 Mar 07 04:48:31 crc kubenswrapper[4689]: I0307 04:48:31.801455 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdr7x/must-gather-kbnks" event={"ID":"667e4097-0a9c-40d6-a15a-c7a0066085ac","Type":"ContainerDied","Data":"606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2"} Mar 07 04:48:31 crc kubenswrapper[4689]: I0307 04:48:31.804881 4689 scope.go:117] "RemoveContainer" containerID="606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2" Mar 07 04:48:31 crc kubenswrapper[4689]: I0307 04:48:31.874951 4689 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-hdr7x_must-gather-kbnks_667e4097-0a9c-40d6-a15a-c7a0066085ac/gather/0.log" Mar 07 04:48:38 crc kubenswrapper[4689]: I0307 04:48:38.826068 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:48:38 crc kubenswrapper[4689]: E0307 04:48:38.827081 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:48:38 crc kubenswrapper[4689]: I0307 04:48:38.827663 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hdr7x/must-gather-kbnks"] Mar 07 04:48:38 crc kubenswrapper[4689]: I0307 04:48:38.827839 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hdr7x/must-gather-kbnks" podUID="667e4097-0a9c-40d6-a15a-c7a0066085ac" containerName="copy" containerID="cri-o://a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470" gracePeriod=2 Mar 07 04:48:38 crc kubenswrapper[4689]: I0307 04:48:38.833083 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hdr7x/must-gather-kbnks"] Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.170159 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hdr7x_must-gather-kbnks_667e4097-0a9c-40d6-a15a-c7a0066085ac/copy/0.log" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.171327 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hdr7x/must-gather-kbnks" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.196487 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/667e4097-0a9c-40d6-a15a-c7a0066085ac-must-gather-output\") pod \"667e4097-0a9c-40d6-a15a-c7a0066085ac\" (UID: \"667e4097-0a9c-40d6-a15a-c7a0066085ac\") " Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.196601 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l99hm\" (UniqueName: \"kubernetes.io/projected/667e4097-0a9c-40d6-a15a-c7a0066085ac-kube-api-access-l99hm\") pod \"667e4097-0a9c-40d6-a15a-c7a0066085ac\" (UID: \"667e4097-0a9c-40d6-a15a-c7a0066085ac\") " Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.203246 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/667e4097-0a9c-40d6-a15a-c7a0066085ac-kube-api-access-l99hm" (OuterVolumeSpecName: "kube-api-access-l99hm") pod "667e4097-0a9c-40d6-a15a-c7a0066085ac" (UID: "667e4097-0a9c-40d6-a15a-c7a0066085ac"). InnerVolumeSpecName "kube-api-access-l99hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.294741 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/667e4097-0a9c-40d6-a15a-c7a0066085ac-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "667e4097-0a9c-40d6-a15a-c7a0066085ac" (UID: "667e4097-0a9c-40d6-a15a-c7a0066085ac"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.302656 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l99hm\" (UniqueName: \"kubernetes.io/projected/667e4097-0a9c-40d6-a15a-c7a0066085ac-kube-api-access-l99hm\") on node \"crc\" DevicePath \"\"" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.302700 4689 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/667e4097-0a9c-40d6-a15a-c7a0066085ac-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.838977 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="667e4097-0a9c-40d6-a15a-c7a0066085ac" path="/var/lib/kubelet/pods/667e4097-0a9c-40d6-a15a-c7a0066085ac/volumes" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.867571 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hdr7x_must-gather-kbnks_667e4097-0a9c-40d6-a15a-c7a0066085ac/copy/0.log" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.868211 4689 generic.go:334] "Generic (PLEG): container finished" podID="667e4097-0a9c-40d6-a15a-c7a0066085ac" containerID="a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470" exitCode=143 Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.868279 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hdr7x/must-gather-kbnks" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.868288 4689 scope.go:117] "RemoveContainer" containerID="a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.889699 4689 scope.go:117] "RemoveContainer" containerID="606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.950610 4689 scope.go:117] "RemoveContainer" containerID="a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470" Mar 07 04:48:39 crc kubenswrapper[4689]: E0307 04:48:39.951374 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470\": container with ID starting with a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470 not found: ID does not exist" containerID="a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.951416 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470"} err="failed to get container status \"a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470\": rpc error: code = NotFound desc = could not find container \"a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470\": container with ID starting with a48b6d2558cc15e286e2a64eb59f9aadc627de689e0162a127a3bb9b33e1f470 not found: ID does not exist" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.951446 4689 scope.go:117] "RemoveContainer" containerID="606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2" Mar 07 04:48:39 crc kubenswrapper[4689]: E0307 04:48:39.951827 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2\": container with ID starting with 606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2 not found: ID does not exist" containerID="606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2" Mar 07 04:48:39 crc kubenswrapper[4689]: I0307 04:48:39.951871 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2"} err="failed to get container status \"606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2\": rpc error: code = NotFound desc = could not find container \"606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2\": container with ID starting with 606449994c872228aa374e7c3c51ee6cfdb44e70925f5034bd613033a3e0afc2 not found: ID does not exist" Mar 07 04:48:50 crc kubenswrapper[4689]: I0307 04:48:50.826640 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:48:50 crc kubenswrapper[4689]: E0307 04:48:50.827574 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:48:53 crc kubenswrapper[4689]: I0307 04:48:53.648951 4689 scope.go:117] "RemoveContainer" containerID="bb343f2f9166b9bc4959d314ca3264486f218ba0a85de9b3452e5d85f221c45f" Mar 07 04:48:53 crc kubenswrapper[4689]: I0307 04:48:53.683108 4689 scope.go:117] "RemoveContainer" containerID="6618bca92086fa9df2e83b3682a84d1f46d3dbb360d76a9572497e866fc42d5a" Mar 07 04:48:53 crc kubenswrapper[4689]: I0307 
04:48:53.767698 4689 scope.go:117] "RemoveContainer" containerID="7d55038de70538245d5fde3aa9812ce8fabefd8263074eca56f8e3c1112ca79e" Mar 07 04:48:53 crc kubenswrapper[4689]: I0307 04:48:53.788429 4689 scope.go:117] "RemoveContainer" containerID="2b0bc0ffbe7e65000809212717bbdfd5cff4a845cbd284041e59b727bbd42d89" Mar 07 04:48:53 crc kubenswrapper[4689]: I0307 04:48:53.809214 4689 scope.go:117] "RemoveContainer" containerID="55ba98ad14b1559c0c895a472e1d83598f8fa20fb19f81b4c7f62b161471175f" Mar 07 04:48:53 crc kubenswrapper[4689]: I0307 04:48:53.837103 4689 scope.go:117] "RemoveContainer" containerID="fae9eaad49e77e33096b5f19d267d45f257700c94bf685adb23a624bdff28d38" Mar 07 04:48:53 crc kubenswrapper[4689]: I0307 04:48:53.857335 4689 scope.go:117] "RemoveContainer" containerID="b28d5dd8095fbae7ccc0311ee6792a744df56ab770cb0df72557db94cb219e45" Mar 07 04:49:01 crc kubenswrapper[4689]: I0307 04:49:01.827009 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:49:01 crc kubenswrapper[4689]: E0307 04:49:01.827910 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:49:14 crc kubenswrapper[4689]: I0307 04:49:14.826090 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:49:14 crc kubenswrapper[4689]: E0307 04:49:14.827361 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:49:16 crc kubenswrapper[4689]: E0307 04:49:16.765933 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:49:16 crc kubenswrapper[4689]: E0307 04:49:16.766342 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:51:18.76632251 +0000 UTC m=+1923.812705999 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:49:16 crc kubenswrapper[4689]: E0307 04:49:16.766726 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:49:16 crc kubenswrapper[4689]: E0307 04:49:16.766758 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:51:18.766748432 +0000 UTC m=+1923.813131921 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:49:26 crc kubenswrapper[4689]: I0307 04:49:26.828292 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:49:26 crc kubenswrapper[4689]: E0307 04:49:26.829356 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:49:40 crc kubenswrapper[4689]: I0307 04:49:40.825734 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:49:40 crc kubenswrapper[4689]: E0307 04:49:40.826817 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:49:51 crc kubenswrapper[4689]: I0307 04:49:51.825343 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:49:51 crc kubenswrapper[4689]: E0307 04:49:51.826129 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:49:53 crc kubenswrapper[4689]: I0307 04:49:53.981796 4689 scope.go:117] "RemoveContainer" containerID="485bf29ff602d11194b64881629d8f1a13fac4096de96657cfdfffc1a68505ce" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.148838 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547650-96hvh"] Mar 07 04:50:00 crc kubenswrapper[4689]: E0307 04:50:00.150254 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667e4097-0a9c-40d6-a15a-c7a0066085ac" containerName="copy" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.150287 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="667e4097-0a9c-40d6-a15a-c7a0066085ac" containerName="copy" Mar 07 04:50:00 crc kubenswrapper[4689]: E0307 04:50:00.150312 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667e4097-0a9c-40d6-a15a-c7a0066085ac" containerName="gather" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.150324 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="667e4097-0a9c-40d6-a15a-c7a0066085ac" containerName="gather" Mar 07 04:50:00 crc kubenswrapper[4689]: E0307 04:50:00.150356 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81670773-b472-461f-8b12-7d589c3442e6" containerName="oc" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.150369 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="81670773-b472-461f-8b12-7d589c3442e6" containerName="oc" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.150533 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="81670773-b472-461f-8b12-7d589c3442e6" containerName="oc" Mar 07 04:50:00 crc 
kubenswrapper[4689]: I0307 04:50:00.150553 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="667e4097-0a9c-40d6-a15a-c7a0066085ac" containerName="gather" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.150576 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="667e4097-0a9c-40d6-a15a-c7a0066085ac" containerName="copy" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.151111 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547650-96hvh" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.155549 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.155761 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.157666 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.163404 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547650-96hvh"] Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.323436 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4n8l\" (UniqueName: \"kubernetes.io/projected/d8c4a9dd-74e6-415a-9d04-783a003dd6e7-kube-api-access-p4n8l\") pod \"auto-csr-approver-29547650-96hvh\" (UID: \"d8c4a9dd-74e6-415a-9d04-783a003dd6e7\") " pod="openshift-infra/auto-csr-approver-29547650-96hvh" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.425356 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4n8l\" (UniqueName: \"kubernetes.io/projected/d8c4a9dd-74e6-415a-9d04-783a003dd6e7-kube-api-access-p4n8l\") pod 
\"auto-csr-approver-29547650-96hvh\" (UID: \"d8c4a9dd-74e6-415a-9d04-783a003dd6e7\") " pod="openshift-infra/auto-csr-approver-29547650-96hvh" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.459414 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4n8l\" (UniqueName: \"kubernetes.io/projected/d8c4a9dd-74e6-415a-9d04-783a003dd6e7-kube-api-access-p4n8l\") pod \"auto-csr-approver-29547650-96hvh\" (UID: \"d8c4a9dd-74e6-415a-9d04-783a003dd6e7\") " pod="openshift-infra/auto-csr-approver-29547650-96hvh" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.489672 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547650-96hvh" Mar 07 04:50:00 crc kubenswrapper[4689]: I0307 04:50:00.753492 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547650-96hvh"] Mar 07 04:50:01 crc kubenswrapper[4689]: I0307 04:50:01.528158 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547650-96hvh" event={"ID":"d8c4a9dd-74e6-415a-9d04-783a003dd6e7","Type":"ContainerStarted","Data":"ed931c94e5da2eef2a65c67fddfaf67d30586c8fd76a6bb7f3a0b5b6c540668b"} Mar 07 04:50:02 crc kubenswrapper[4689]: I0307 04:50:02.536821 4689 generic.go:334] "Generic (PLEG): container finished" podID="d8c4a9dd-74e6-415a-9d04-783a003dd6e7" containerID="ed3222efb5f143fd87f78213a58a7c30dc8d4c5b626246d71c6e9ea2d50ef439" exitCode=0 Mar 07 04:50:02 crc kubenswrapper[4689]: I0307 04:50:02.537082 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547650-96hvh" event={"ID":"d8c4a9dd-74e6-415a-9d04-783a003dd6e7","Type":"ContainerDied","Data":"ed3222efb5f143fd87f78213a58a7c30dc8d4c5b626246d71c6e9ea2d50ef439"} Mar 07 04:50:02 crc kubenswrapper[4689]: I0307 04:50:02.825921 4689 scope.go:117] "RemoveContainer" 
containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:50:02 crc kubenswrapper[4689]: E0307 04:50:02.826529 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:50:03 crc kubenswrapper[4689]: I0307 04:50:03.830401 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547650-96hvh" Mar 07 04:50:03 crc kubenswrapper[4689]: I0307 04:50:03.982732 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4n8l\" (UniqueName: \"kubernetes.io/projected/d8c4a9dd-74e6-415a-9d04-783a003dd6e7-kube-api-access-p4n8l\") pod \"d8c4a9dd-74e6-415a-9d04-783a003dd6e7\" (UID: \"d8c4a9dd-74e6-415a-9d04-783a003dd6e7\") " Mar 07 04:50:03 crc kubenswrapper[4689]: I0307 04:50:03.988534 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c4a9dd-74e6-415a-9d04-783a003dd6e7-kube-api-access-p4n8l" (OuterVolumeSpecName: "kube-api-access-p4n8l") pod "d8c4a9dd-74e6-415a-9d04-783a003dd6e7" (UID: "d8c4a9dd-74e6-415a-9d04-783a003dd6e7"). InnerVolumeSpecName "kube-api-access-p4n8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:50:04 crc kubenswrapper[4689]: I0307 04:50:04.084284 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4n8l\" (UniqueName: \"kubernetes.io/projected/d8c4a9dd-74e6-415a-9d04-783a003dd6e7-kube-api-access-p4n8l\") on node \"crc\" DevicePath \"\"" Mar 07 04:50:04 crc kubenswrapper[4689]: I0307 04:50:04.555730 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547650-96hvh" event={"ID":"d8c4a9dd-74e6-415a-9d04-783a003dd6e7","Type":"ContainerDied","Data":"ed931c94e5da2eef2a65c67fddfaf67d30586c8fd76a6bb7f3a0b5b6c540668b"} Mar 07 04:50:04 crc kubenswrapper[4689]: I0307 04:50:04.555790 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed931c94e5da2eef2a65c67fddfaf67d30586c8fd76a6bb7f3a0b5b6c540668b" Mar 07 04:50:04 crc kubenswrapper[4689]: I0307 04:50:04.555796 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547650-96hvh" Mar 07 04:50:04 crc kubenswrapper[4689]: I0307 04:50:04.904783 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547644-s9mfn"] Mar 07 04:50:04 crc kubenswrapper[4689]: I0307 04:50:04.910059 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547644-s9mfn"] Mar 07 04:50:05 crc kubenswrapper[4689]: I0307 04:50:05.839315 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3603b5-9735-4260-8976-8589b3013d8d" path="/var/lib/kubelet/pods/ba3603b5-9735-4260-8976-8589b3013d8d/volumes" Mar 07 04:50:13 crc kubenswrapper[4689]: I0307 04:50:13.825728 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:50:13 crc kubenswrapper[4689]: E0307 04:50:13.826620 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:50:27 crc kubenswrapper[4689]: I0307 04:50:27.827237 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:50:27 crc kubenswrapper[4689]: E0307 04:50:27.828073 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:50:40 crc kubenswrapper[4689]: I0307 04:50:40.826219 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536" Mar 07 04:50:41 crc kubenswrapper[4689]: I0307 04:50:41.841086 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"fb4b2124c937a1be1975e56b486e30c77111eb4ae794931d0065d01fdf7d1cc6"} Mar 07 04:50:54 crc kubenswrapper[4689]: I0307 04:50:54.075103 4689 scope.go:117] "RemoveContainer" containerID="ddcf9a57f6ae602aa55f457de587e1eb15f5211939b9ceb36d8b2bb7207f1422" Mar 07 04:50:54 crc kubenswrapper[4689]: I0307 04:50:54.108056 4689 scope.go:117] "RemoveContainer" containerID="301c5d60131d107d73acd8f94df46d4c74660137febf8822678d498d3d23e1af" Mar 07 04:51:18 crc kubenswrapper[4689]: E0307 04:51:18.831097 4689 configmap.go:193] 
Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:51:18 crc kubenswrapper[4689]: E0307 04:51:18.832054 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:53:20.832016444 +0000 UTC m=+2045.878400013 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:51:18 crc kubenswrapper[4689]: E0307 04:51:18.831147 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:51:18 crc kubenswrapper[4689]: E0307 04:51:18.832241 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:53:20.832206219 +0000 UTC m=+2045.878589738 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.272342 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-97nfh/must-gather-4mw4v"] Mar 07 04:51:27 crc kubenswrapper[4689]: E0307 04:51:27.273022 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c4a9dd-74e6-415a-9d04-783a003dd6e7" containerName="oc" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.273036 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c4a9dd-74e6-415a-9d04-783a003dd6e7" containerName="oc" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.273185 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c4a9dd-74e6-415a-9d04-783a003dd6e7" containerName="oc" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.273885 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-97nfh/must-gather-4mw4v" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.276204 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-97nfh"/"openshift-service-ca.crt" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.276755 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-97nfh"/"kube-root-ca.crt" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.343155 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-97nfh/must-gather-4mw4v"] Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.367859 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d940c90-cfda-4c9e-9066-a5258ce6604c-must-gather-output\") pod \"must-gather-4mw4v\" (UID: \"8d940c90-cfda-4c9e-9066-a5258ce6604c\") " pod="openshift-must-gather-97nfh/must-gather-4mw4v" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.368003 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgbzg\" (UniqueName: \"kubernetes.io/projected/8d940c90-cfda-4c9e-9066-a5258ce6604c-kube-api-access-hgbzg\") pod \"must-gather-4mw4v\" (UID: \"8d940c90-cfda-4c9e-9066-a5258ce6604c\") " pod="openshift-must-gather-97nfh/must-gather-4mw4v" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.469676 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgbzg\" (UniqueName: \"kubernetes.io/projected/8d940c90-cfda-4c9e-9066-a5258ce6604c-kube-api-access-hgbzg\") pod \"must-gather-4mw4v\" (UID: \"8d940c90-cfda-4c9e-9066-a5258ce6604c\") " pod="openshift-must-gather-97nfh/must-gather-4mw4v" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.470061 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d940c90-cfda-4c9e-9066-a5258ce6604c-must-gather-output\") pod \"must-gather-4mw4v\" (UID: \"8d940c90-cfda-4c9e-9066-a5258ce6604c\") " pod="openshift-must-gather-97nfh/must-gather-4mw4v" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.470600 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d940c90-cfda-4c9e-9066-a5258ce6604c-must-gather-output\") pod \"must-gather-4mw4v\" (UID: \"8d940c90-cfda-4c9e-9066-a5258ce6604c\") " pod="openshift-must-gather-97nfh/must-gather-4mw4v" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.490705 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgbzg\" (UniqueName: \"kubernetes.io/projected/8d940c90-cfda-4c9e-9066-a5258ce6604c-kube-api-access-hgbzg\") pod \"must-gather-4mw4v\" (UID: \"8d940c90-cfda-4c9e-9066-a5258ce6604c\") " pod="openshift-must-gather-97nfh/must-gather-4mw4v" Mar 07 04:51:27 crc kubenswrapper[4689]: I0307 04:51:27.596395 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-97nfh/must-gather-4mw4v" Mar 07 04:51:28 crc kubenswrapper[4689]: I0307 04:51:28.056369 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-97nfh/must-gather-4mw4v"] Mar 07 04:51:28 crc kubenswrapper[4689]: I0307 04:51:28.300343 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-97nfh/must-gather-4mw4v" event={"ID":"8d940c90-cfda-4c9e-9066-a5258ce6604c","Type":"ContainerStarted","Data":"ee1f7fc76b4da88c185437b765ab2c212c609a2fe029de183e19fa4bcf348b14"} Mar 07 04:51:29 crc kubenswrapper[4689]: I0307 04:51:29.308394 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-97nfh/must-gather-4mw4v" event={"ID":"8d940c90-cfda-4c9e-9066-a5258ce6604c","Type":"ContainerStarted","Data":"f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba"} Mar 07 04:51:29 crc kubenswrapper[4689]: I0307 04:51:29.308440 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-97nfh/must-gather-4mw4v" event={"ID":"8d940c90-cfda-4c9e-9066-a5258ce6604c","Type":"ContainerStarted","Data":"97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36"} Mar 07 04:51:29 crc kubenswrapper[4689]: I0307 04:51:29.332421 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-97nfh/must-gather-4mw4v" podStartSLOduration=2.332397576 podStartE2EDuration="2.332397576s" podCreationTimestamp="2026-03-07 04:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 04:51:29.329775115 +0000 UTC m=+1934.376158604" watchObservedRunningTime="2026-03-07 04:51:29.332397576 +0000 UTC m=+1934.378781095" Mar 07 04:52:00 crc kubenswrapper[4689]: I0307 04:52:00.150868 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547652-wtd2s"] Mar 07 04:52:00 crc 
kubenswrapper[4689]: I0307 04:52:00.152228 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547652-wtd2s" Mar 07 04:52:00 crc kubenswrapper[4689]: I0307 04:52:00.154122 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:52:00 crc kubenswrapper[4689]: I0307 04:52:00.154298 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:52:00 crc kubenswrapper[4689]: I0307 04:52:00.156521 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:52:00 crc kubenswrapper[4689]: I0307 04:52:00.166440 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547652-wtd2s"] Mar 07 04:52:00 crc kubenswrapper[4689]: I0307 04:52:00.321900 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h85f\" (UniqueName: \"kubernetes.io/projected/6410a1f0-08d7-40d5-882d-508e8850a319-kube-api-access-2h85f\") pod \"auto-csr-approver-29547652-wtd2s\" (UID: \"6410a1f0-08d7-40d5-882d-508e8850a319\") " pod="openshift-infra/auto-csr-approver-29547652-wtd2s" Mar 07 04:52:00 crc kubenswrapper[4689]: I0307 04:52:00.423084 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h85f\" (UniqueName: \"kubernetes.io/projected/6410a1f0-08d7-40d5-882d-508e8850a319-kube-api-access-2h85f\") pod \"auto-csr-approver-29547652-wtd2s\" (UID: \"6410a1f0-08d7-40d5-882d-508e8850a319\") " pod="openshift-infra/auto-csr-approver-29547652-wtd2s" Mar 07 04:52:00 crc kubenswrapper[4689]: I0307 04:52:00.441158 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h85f\" (UniqueName: \"kubernetes.io/projected/6410a1f0-08d7-40d5-882d-508e8850a319-kube-api-access-2h85f\") pod 
\"auto-csr-approver-29547652-wtd2s\" (UID: \"6410a1f0-08d7-40d5-882d-508e8850a319\") " pod="openshift-infra/auto-csr-approver-29547652-wtd2s" Mar 07 04:52:00 crc kubenswrapper[4689]: I0307 04:52:00.473712 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547652-wtd2s" Mar 07 04:52:00 crc kubenswrapper[4689]: I0307 04:52:00.688527 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547652-wtd2s"] Mar 07 04:52:00 crc kubenswrapper[4689]: W0307 04:52:00.699934 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6410a1f0_08d7_40d5_882d_508e8850a319.slice/crio-414305a07bcb3e61a4384fb89ab3abf7980ebae0be0795a3f277731d2b81a52a WatchSource:0}: Error finding container 414305a07bcb3e61a4384fb89ab3abf7980ebae0be0795a3f277731d2b81a52a: Status 404 returned error can't find the container with id 414305a07bcb3e61a4384fb89ab3abf7980ebae0be0795a3f277731d2b81a52a Mar 07 04:52:01 crc kubenswrapper[4689]: I0307 04:52:01.514256 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547652-wtd2s" event={"ID":"6410a1f0-08d7-40d5-882d-508e8850a319","Type":"ContainerStarted","Data":"414305a07bcb3e61a4384fb89ab3abf7980ebae0be0795a3f277731d2b81a52a"} Mar 07 04:52:03 crc kubenswrapper[4689]: I0307 04:52:03.545106 4689 generic.go:334] "Generic (PLEG): container finished" podID="6410a1f0-08d7-40d5-882d-508e8850a319" containerID="8aee6bd6c281cb8b810e0425f891a8e4337cc965d0bdc39e34b21a3b71037ed2" exitCode=0 Mar 07 04:52:03 crc kubenswrapper[4689]: I0307 04:52:03.545212 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547652-wtd2s" event={"ID":"6410a1f0-08d7-40d5-882d-508e8850a319","Type":"ContainerDied","Data":"8aee6bd6c281cb8b810e0425f891a8e4337cc965d0bdc39e34b21a3b71037ed2"} Mar 07 04:52:04 crc kubenswrapper[4689]: 
I0307 04:52:04.789630 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547652-wtd2s" Mar 07 04:52:04 crc kubenswrapper[4689]: I0307 04:52:04.985159 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h85f\" (UniqueName: \"kubernetes.io/projected/6410a1f0-08d7-40d5-882d-508e8850a319-kube-api-access-2h85f\") pod \"6410a1f0-08d7-40d5-882d-508e8850a319\" (UID: \"6410a1f0-08d7-40d5-882d-508e8850a319\") " Mar 07 04:52:05 crc kubenswrapper[4689]: I0307 04:52:05.005430 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6410a1f0-08d7-40d5-882d-508e8850a319-kube-api-access-2h85f" (OuterVolumeSpecName: "kube-api-access-2h85f") pod "6410a1f0-08d7-40d5-882d-508e8850a319" (UID: "6410a1f0-08d7-40d5-882d-508e8850a319"). InnerVolumeSpecName "kube-api-access-2h85f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:52:05 crc kubenswrapper[4689]: I0307 04:52:05.087079 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h85f\" (UniqueName: \"kubernetes.io/projected/6410a1f0-08d7-40d5-882d-508e8850a319-kube-api-access-2h85f\") on node \"crc\" DevicePath \"\"" Mar 07 04:52:05 crc kubenswrapper[4689]: I0307 04:52:05.560635 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547652-wtd2s" event={"ID":"6410a1f0-08d7-40d5-882d-508e8850a319","Type":"ContainerDied","Data":"414305a07bcb3e61a4384fb89ab3abf7980ebae0be0795a3f277731d2b81a52a"} Mar 07 04:52:05 crc kubenswrapper[4689]: I0307 04:52:05.560680 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="414305a07bcb3e61a4384fb89ab3abf7980ebae0be0795a3f277731d2b81a52a" Mar 07 04:52:05 crc kubenswrapper[4689]: I0307 04:52:05.560764 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547652-wtd2s" Mar 07 04:52:05 crc kubenswrapper[4689]: I0307 04:52:05.862407 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547646-p8rj9"] Mar 07 04:52:05 crc kubenswrapper[4689]: I0307 04:52:05.867103 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547646-p8rj9"] Mar 07 04:52:06 crc kubenswrapper[4689]: I0307 04:52:06.744653 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/util/0.log" Mar 07 04:52:06 crc kubenswrapper[4689]: I0307 04:52:06.930862 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/util/0.log" Mar 07 04:52:06 crc kubenswrapper[4689]: I0307 04:52:06.969466 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/pull/0.log" Mar 07 04:52:06 crc kubenswrapper[4689]: I0307 04:52:06.996061 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/pull/0.log" Mar 07 04:52:07 crc kubenswrapper[4689]: I0307 04:52:07.190478 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/extract/0.log" Mar 07 04:52:07 crc kubenswrapper[4689]: I0307 04:52:07.204230 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/util/0.log" Mar 07 
04:52:07 crc kubenswrapper[4689]: I0307 04:52:07.205269 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a7c0211d8a1f1e3c52c34db6556cc5a1315493c9644a4b072e08eca7c14wvgk_baf61d7b-9301-4e93-ba1f-60d19c9497d2/pull/0.log" Mar 07 04:52:07 crc kubenswrapper[4689]: I0307 04:52:07.490745 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-2rfwv_4fbda293-a134-43ca-8f42-6bc32bae4b57/registry-server/0.log" Mar 07 04:52:07 crc kubenswrapper[4689]: I0307 04:52:07.551965 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5df65d59b6-8hmtq_96b13de3-e5e2-456c-8b92-fb7adb492a65/manager/0.log" Mar 07 04:52:07 crc kubenswrapper[4689]: I0307 04:52:07.833566 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305d3ac0-fe32-4daf-90bb-4a57426aed26" path="/var/lib/kubelet/pods/305d3ac0-fe32-4daf-90bb-4a57426aed26/volumes" Mar 07 04:52:20 crc kubenswrapper[4689]: I0307 04:52:20.857083 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wblnn_ee57bd24-197d-4722-9a1a-a73e914a0973/control-plane-machine-set-operator/0.log" Mar 07 04:52:21 crc kubenswrapper[4689]: I0307 04:52:21.005535 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8ggcp_3ec0b40d-04d4-486b-93bc-361c72d74aad/kube-rbac-proxy/0.log" Mar 07 04:52:21 crc kubenswrapper[4689]: I0307 04:52:21.043986 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8ggcp_3ec0b40d-04d4-486b-93bc-361c72d74aad/machine-api-operator/0.log" Mar 07 04:52:49 crc kubenswrapper[4689]: I0307 04:52:49.725105 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-z2zk8_92ada8da-b00e-4106-8831-bfe7a78d4806/kube-rbac-proxy/0.log" 
Mar 07 04:52:49 crc kubenswrapper[4689]: I0307 04:52:49.740328 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-z2zk8_92ada8da-b00e-4106-8831-bfe7a78d4806/controller/0.log"
Mar 07 04:52:49 crc kubenswrapper[4689]: I0307 04:52:49.853792 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-frr-files/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.029720 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-reloader/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.059591 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-reloader/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.080269 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-metrics/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.086771 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-frr-files/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.248762 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-frr-files/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.289745 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-metrics/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.291299 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-reloader/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.312620 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-metrics/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.457851 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-metrics/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.477107 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-frr-files/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.492729 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/cp-reloader/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.498861 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/controller/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.640398 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/frr-metrics/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.672862 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/kube-rbac-proxy-frr/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.703418 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/kube-rbac-proxy/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.804742 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/reloader/0.log"
Mar 07 04:52:50 crc kubenswrapper[4689]: I0307 04:52:50.895491 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-d5xx7_9f428eff-914b-4bee-a9ee-7399d39a38c0/frr-k8s-webhook-server/0.log"
Mar 07 04:52:51 crc kubenswrapper[4689]: I0307 04:52:51.029529 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-77cb8466b4-dgs2t_c42d4852-f686-4b2c-a03e-735b386d752a/manager/0.log"
Mar 07 04:52:51 crc kubenswrapper[4689]: I0307 04:52:51.216990 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-65b497c9c9-r86tt_3838aa56-d0d3-4bce-95d0-7e760c2be14b/webhook-server/0.log"
Mar 07 04:52:51 crc kubenswrapper[4689]: I0307 04:52:51.282423 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mpvnx_c6fd9827-217f-4143-94c3-13c5c8257e98/kube-rbac-proxy/0.log"
Mar 07 04:52:51 crc kubenswrapper[4689]: I0307 04:52:51.532628 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rgs5v_904924fa-b259-4cf4-8296-a7534f087102/frr/0.log"
Mar 07 04:52:51 crc kubenswrapper[4689]: I0307 04:52:51.565427 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mpvnx_c6fd9827-217f-4143-94c3-13c5c8257e98/speaker/0.log"
Mar 07 04:52:54 crc kubenswrapper[4689]: I0307 04:52:54.216722 4689 scope.go:117] "RemoveContainer" containerID="1f48436e09c96bae6a439165eb8ed88a9d65798e71a8632de5e6cfa33f44821d"
Mar 07 04:52:59 crc kubenswrapper[4689]: I0307 04:52:59.189430 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 04:52:59 crc kubenswrapper[4689]: I0307 04:52:59.189852 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 04:53:03 crc kubenswrapper[4689]: I0307 04:53:03.960355 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_3c25a937-0d93-4077-92d7-fbeac4f6abb3/openstackclient/0.log"
Mar 07 04:53:16 crc kubenswrapper[4689]: I0307 04:53:16.812774 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-utilities/0.log"
Mar 07 04:53:16 crc kubenswrapper[4689]: I0307 04:53:16.992890 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-utilities/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.016257 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-content/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.070500 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-content/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.197756 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-utilities/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.216609 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/extract-content/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.388081 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-utilities/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.579625 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4xc7w_8ba6dceb-a52c-4108-af6e-ca861cdff2d9/registry-server/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.579810 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-content/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.609370 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-content/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.635209 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-utilities/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.752773 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-content/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.753936 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/extract-utilities/0.log"
Mar 07 04:53:17 crc kubenswrapper[4689]: I0307 04:53:17.941450 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/util/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.130535 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/pull/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.174344 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4qkv8_4a76727a-27ba-4d05-92cf-01ec595c6989/registry-server/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.175344 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/pull/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.180091 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/util/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.402356 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/util/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.432454 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/extract/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.453054 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tp2df_a69229a6-7e04-4039-b08e-09cef56b36ba/pull/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.551003 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2qc7s_431626bb-08c9-4190-83e1-d4d5fd7cb198/marketplace-operator/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.584968 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-utilities/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.792403 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-utilities/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.802013 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-content/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.827556 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-content/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.939915 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-content/0.log"
Mar 07 04:53:18 crc kubenswrapper[4689]: I0307 04:53:18.952850 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/extract-utilities/0.log"
Mar 07 04:53:19 crc kubenswrapper[4689]: I0307 04:53:19.037803 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g6vbv_f58e77c1-4fe5-4b43-bd3c-babc094119f0/registry-server/0.log"
Mar 07 04:53:19 crc kubenswrapper[4689]: I0307 04:53:19.091192 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-utilities/0.log"
Mar 07 04:53:19 crc kubenswrapper[4689]: I0307 04:53:19.299964 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-utilities/0.log"
Mar 07 04:53:19 crc kubenswrapper[4689]: I0307 04:53:19.302458 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-content/0.log"
Mar 07 04:53:19 crc kubenswrapper[4689]: I0307 04:53:19.310574 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-content/0.log"
Mar 07 04:53:19 crc kubenswrapper[4689]: I0307 04:53:19.505627 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-utilities/0.log"
Mar 07 04:53:19 crc kubenswrapper[4689]: I0307 04:53:19.518239 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/extract-content/0.log"
Mar 07 04:53:19 crc kubenswrapper[4689]: I0307 04:53:19.865528 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8cclj_769da50c-a6db-491d-90d7-146ac186dad8/registry-server/0.log"
Mar 07 04:53:20 crc kubenswrapper[4689]: E0307 04:53:20.852532 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found
Mar 07 04:53:20 crc kubenswrapper[4689]: E0307 04:53:20.852557 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Mar 07 04:53:20 crc kubenswrapper[4689]: E0307 04:53:20.852616 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:55:22.852599511 +0000 UTC m=+2167.898983000 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found
Mar 07 04:53:20 crc kubenswrapper[4689]: E0307 04:53:20.852638 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:55:22.852623451 +0000 UTC m=+2167.899006950 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found
Mar 07 04:53:29 crc kubenswrapper[4689]: I0307 04:53:29.190126 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 04:53:29 crc kubenswrapper[4689]: I0307 04:53:29.190604 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.189646 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.190071 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.190107 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dss5c"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.190529 4689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb4b2124c937a1be1975e56b486e30c77111eb4ae794931d0065d01fdf7d1cc6"} pod="openshift-machine-config-operator/machine-config-daemon-dss5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.190583 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" containerID="cri-o://fb4b2124c937a1be1975e56b486e30c77111eb4ae794931d0065d01fdf7d1cc6" gracePeriod=600
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.611037 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x7tsq"]
Mar 07 04:53:59 crc kubenswrapper[4689]: E0307 04:53:59.611864 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6410a1f0-08d7-40d5-882d-508e8850a319" containerName="oc"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.611893 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="6410a1f0-08d7-40d5-882d-508e8850a319" containerName="oc"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.612121 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="6410a1f0-08d7-40d5-882d-508e8850a319" containerName="oc"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.613308 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.619541 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x7tsq"]
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.697398 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-utilities\") pod \"redhat-operators-x7tsq\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") " pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.697732 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-catalog-content\") pod \"redhat-operators-x7tsq\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") " pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.697783 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89wxq\" (UniqueName: \"kubernetes.io/projected/1c61841c-fd07-4c84-8591-f01cbadc6d42-kube-api-access-89wxq\") pod \"redhat-operators-x7tsq\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") " pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.798796 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-utilities\") pod \"redhat-operators-x7tsq\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") " pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.798841 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-catalog-content\") pod \"redhat-operators-x7tsq\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") " pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.798892 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89wxq\" (UniqueName: \"kubernetes.io/projected/1c61841c-fd07-4c84-8591-f01cbadc6d42-kube-api-access-89wxq\") pod \"redhat-operators-x7tsq\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") " pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.799605 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-catalog-content\") pod \"redhat-operators-x7tsq\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") " pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.799888 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-utilities\") pod \"redhat-operators-x7tsq\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") " pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.827426 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89wxq\" (UniqueName: \"kubernetes.io/projected/1c61841c-fd07-4c84-8591-f01cbadc6d42-kube-api-access-89wxq\") pod \"redhat-operators-x7tsq\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") " pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:53:59 crc kubenswrapper[4689]: I0307 04:53:59.935876 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.144402 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547654-9svt9"]
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.145621 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547654-9svt9"
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.148584 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws"
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.148782 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.148788 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.154212 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547654-9svt9"]
Mar 07 04:54:00 crc kubenswrapper[4689]: W0307 04:54:00.156108 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c61841c_fd07_4c84_8591_f01cbadc6d42.slice/crio-b90cf755ed3f37c9650919968be937d3b240d13ac450ebc68250c725b1029e6c WatchSource:0}: Error finding container b90cf755ed3f37c9650919968be937d3b240d13ac450ebc68250c725b1029e6c: Status 404 returned error can't find the container with id b90cf755ed3f37c9650919968be937d3b240d13ac450ebc68250c725b1029e6c
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.166084 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x7tsq"]
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.205214 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfct5\" (UniqueName: \"kubernetes.io/projected/e6c0ec17-3f11-4b38-a51a-d66ccebba90d-kube-api-access-gfct5\") pod \"auto-csr-approver-29547654-9svt9\" (UID: \"e6c0ec17-3f11-4b38-a51a-d66ccebba90d\") " pod="openshift-infra/auto-csr-approver-29547654-9svt9"
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.306685 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfct5\" (UniqueName: \"kubernetes.io/projected/e6c0ec17-3f11-4b38-a51a-d66ccebba90d-kube-api-access-gfct5\") pod \"auto-csr-approver-29547654-9svt9\" (UID: \"e6c0ec17-3f11-4b38-a51a-d66ccebba90d\") " pod="openshift-infra/auto-csr-approver-29547654-9svt9"
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.318823 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7tsq" event={"ID":"1c61841c-fd07-4c84-8591-f01cbadc6d42","Type":"ContainerStarted","Data":"b90cf755ed3f37c9650919968be937d3b240d13ac450ebc68250c725b1029e6c"}
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.321714 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerID="fb4b2124c937a1be1975e56b486e30c77111eb4ae794931d0065d01fdf7d1cc6" exitCode=0
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.322349 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerDied","Data":"fb4b2124c937a1be1975e56b486e30c77111eb4ae794931d0065d01fdf7d1cc6"}
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.322383 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerStarted","Data":"a5e920e422f290f0adaacaa8069f13f665148bd13172616c95c274c51f3d032d"}
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.322403 4689 scope.go:117] "RemoveContainer" containerID="84850a0136eefb33c3de3307e4d7fccd5b4e6c66258f2b3bfc2eb182c4d0e536"
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.329887 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfct5\" (UniqueName: \"kubernetes.io/projected/e6c0ec17-3f11-4b38-a51a-d66ccebba90d-kube-api-access-gfct5\") pod \"auto-csr-approver-29547654-9svt9\" (UID: \"e6c0ec17-3f11-4b38-a51a-d66ccebba90d\") " pod="openshift-infra/auto-csr-approver-29547654-9svt9"
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.463746 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547654-9svt9"
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.671131 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547654-9svt9"]
Mar 07 04:54:00 crc kubenswrapper[4689]: I0307 04:54:00.683561 4689 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 07 04:54:01 crc kubenswrapper[4689]: I0307 04:54:01.330474 4689 generic.go:334] "Generic (PLEG): container finished" podID="1c61841c-fd07-4c84-8591-f01cbadc6d42" containerID="f5042e587670300b48f3efb240fc386fb1d83cdd161ff45647e6ad267cb3cef8" exitCode=0
Mar 07 04:54:01 crc kubenswrapper[4689]: I0307 04:54:01.330524 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7tsq" event={"ID":"1c61841c-fd07-4c84-8591-f01cbadc6d42","Type":"ContainerDied","Data":"f5042e587670300b48f3efb240fc386fb1d83cdd161ff45647e6ad267cb3cef8"}
Mar 07 04:54:01 crc kubenswrapper[4689]: I0307 04:54:01.342164 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547654-9svt9" event={"ID":"e6c0ec17-3f11-4b38-a51a-d66ccebba90d","Type":"ContainerStarted","Data":"8e4fa63e446d518257c26c710c451b82a30a37e0d2c3e858f2e4e0f833fd7871"}
Mar 07 04:54:04 crc kubenswrapper[4689]: I0307 04:54:04.364508 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6c0ec17-3f11-4b38-a51a-d66ccebba90d" containerID="abbfdaec5b2e169dc4cb8451abe7f97ba7d1333d1e84975efdc45472529c28b3" exitCode=0
Mar 07 04:54:04 crc kubenswrapper[4689]: I0307 04:54:04.364546 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547654-9svt9" event={"ID":"e6c0ec17-3f11-4b38-a51a-d66ccebba90d","Type":"ContainerDied","Data":"abbfdaec5b2e169dc4cb8451abe7f97ba7d1333d1e84975efdc45472529c28b3"}
Mar 07 04:54:04 crc kubenswrapper[4689]: I0307 04:54:04.368071 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7tsq" event={"ID":"1c61841c-fd07-4c84-8591-f01cbadc6d42","Type":"ContainerStarted","Data":"1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c"}
Mar 07 04:54:05 crc kubenswrapper[4689]: I0307 04:54:05.376440 4689 generic.go:334] "Generic (PLEG): container finished" podID="1c61841c-fd07-4c84-8591-f01cbadc6d42" containerID="1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c" exitCode=0
Mar 07 04:54:05 crc kubenswrapper[4689]: I0307 04:54:05.377237 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7tsq" event={"ID":"1c61841c-fd07-4c84-8591-f01cbadc6d42","Type":"ContainerDied","Data":"1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c"}
Mar 07 04:54:05 crc kubenswrapper[4689]: I0307 04:54:05.645977 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547654-9svt9"
Mar 07 04:54:05 crc kubenswrapper[4689]: I0307 04:54:05.781132 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfct5\" (UniqueName: \"kubernetes.io/projected/e6c0ec17-3f11-4b38-a51a-d66ccebba90d-kube-api-access-gfct5\") pod \"e6c0ec17-3f11-4b38-a51a-d66ccebba90d\" (UID: \"e6c0ec17-3f11-4b38-a51a-d66ccebba90d\") "
Mar 07 04:54:05 crc kubenswrapper[4689]: I0307 04:54:05.787455 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c0ec17-3f11-4b38-a51a-d66ccebba90d-kube-api-access-gfct5" (OuterVolumeSpecName: "kube-api-access-gfct5") pod "e6c0ec17-3f11-4b38-a51a-d66ccebba90d" (UID: "e6c0ec17-3f11-4b38-a51a-d66ccebba90d"). InnerVolumeSpecName "kube-api-access-gfct5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:54:05 crc kubenswrapper[4689]: I0307 04:54:05.883023 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfct5\" (UniqueName: \"kubernetes.io/projected/e6c0ec17-3f11-4b38-a51a-d66ccebba90d-kube-api-access-gfct5\") on node \"crc\" DevicePath \"\""
Mar 07 04:54:06 crc kubenswrapper[4689]: I0307 04:54:06.384756 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547654-9svt9" event={"ID":"e6c0ec17-3f11-4b38-a51a-d66ccebba90d","Type":"ContainerDied","Data":"8e4fa63e446d518257c26c710c451b82a30a37e0d2c3e858f2e4e0f833fd7871"}
Mar 07 04:54:06 crc kubenswrapper[4689]: I0307 04:54:06.386205 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e4fa63e446d518257c26c710c451b82a30a37e0d2c3e858f2e4e0f833fd7871"
Mar 07 04:54:06 crc kubenswrapper[4689]: I0307 04:54:06.384772 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547654-9svt9"
Mar 07 04:54:06 crc kubenswrapper[4689]: I0307 04:54:06.387567 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7tsq" event={"ID":"1c61841c-fd07-4c84-8591-f01cbadc6d42","Type":"ContainerStarted","Data":"ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff"}
Mar 07 04:54:06 crc kubenswrapper[4689]: I0307 04:54:06.405407 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x7tsq" podStartSLOduration=2.857188596 podStartE2EDuration="7.405392658s" podCreationTimestamp="2026-03-07 04:53:59 +0000 UTC" firstStartedPulling="2026-03-07 04:54:01.332325239 +0000 UTC m=+2086.378708738" lastFinishedPulling="2026-03-07 04:54:05.880529311 +0000 UTC m=+2090.926912800" observedRunningTime="2026-03-07 04:54:06.404138804 +0000 UTC m=+2091.450522303" watchObservedRunningTime="2026-03-07 04:54:06.405392658 +0000 UTC m=+2091.451776147"
Mar 07 04:54:06 crc kubenswrapper[4689]: I0307 04:54:06.703753 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547648-qjvvn"]
Mar 07 04:54:06 crc kubenswrapper[4689]: I0307 04:54:06.709960 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547648-qjvvn"]
Mar 07 04:54:07 crc kubenswrapper[4689]: I0307 04:54:07.843678 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81670773-b472-461f-8b12-7d589c3442e6" path="/var/lib/kubelet/pods/81670773-b472-461f-8b12-7d589c3442e6/volumes"
Mar 07 04:54:09 crc kubenswrapper[4689]: I0307 04:54:09.936871 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:54:09 crc kubenswrapper[4689]: I0307 04:54:09.937466 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:54:10 crc kubenswrapper[4689]: I0307 04:54:10.990550 4689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x7tsq" podUID="1c61841c-fd07-4c84-8591-f01cbadc6d42" containerName="registry-server" probeResult="failure" output=<
Mar 07 04:54:10 crc kubenswrapper[4689]: timeout: failed to connect service ":50051" within 1s
Mar 07 04:54:10 crc kubenswrapper[4689]: >
Mar 07 04:54:20 crc kubenswrapper[4689]: I0307 04:54:20.010493 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:54:20 crc kubenswrapper[4689]: I0307 04:54:20.081905 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:54:21 crc kubenswrapper[4689]: I0307 04:54:21.450685 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x7tsq"]
Mar 07 04:54:21 crc kubenswrapper[4689]: I0307 04:54:21.508801 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x7tsq" podUID="1c61841c-fd07-4c84-8591-f01cbadc6d42" containerName="registry-server" containerID="cri-o://ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff" gracePeriod=2
Mar 07 04:54:21 crc kubenswrapper[4689]: I0307 04:54:21.926015 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7tsq"
Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.111259 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89wxq\" (UniqueName: \"kubernetes.io/projected/1c61841c-fd07-4c84-8591-f01cbadc6d42-kube-api-access-89wxq\") pod \"1c61841c-fd07-4c84-8591-f01cbadc6d42\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") "
Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.111374 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-utilities\") pod \"1c61841c-fd07-4c84-8591-f01cbadc6d42\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") "
Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.111501 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-catalog-content\") pod \"1c61841c-fd07-4c84-8591-f01cbadc6d42\" (UID: \"1c61841c-fd07-4c84-8591-f01cbadc6d42\") "
Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.112421 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-utilities" (OuterVolumeSpecName: "utilities") pod "1c61841c-fd07-4c84-8591-f01cbadc6d42" (UID: "1c61841c-fd07-4c84-8591-f01cbadc6d42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.119971 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c61841c-fd07-4c84-8591-f01cbadc6d42-kube-api-access-89wxq" (OuterVolumeSpecName: "kube-api-access-89wxq") pod "1c61841c-fd07-4c84-8591-f01cbadc6d42" (UID: "1c61841c-fd07-4c84-8591-f01cbadc6d42"). InnerVolumeSpecName "kube-api-access-89wxq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.213494 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89wxq\" (UniqueName: \"kubernetes.io/projected/1c61841c-fd07-4c84-8591-f01cbadc6d42-kube-api-access-89wxq\") on node \"crc\" DevicePath \"\""
Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.213540 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-utilities\") on node \"crc\" DevicePath \"\""
Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.227684 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c61841c-fd07-4c84-8591-f01cbadc6d42" (UID: "1c61841c-fd07-4c84-8591-f01cbadc6d42"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.315152 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c61841c-fd07-4c84-8591-f01cbadc6d42-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.523947 4689 generic.go:334] "Generic (PLEG): container finished" podID="1c61841c-fd07-4c84-8591-f01cbadc6d42" containerID="ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff" exitCode=0 Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.524106 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7tsq" event={"ID":"1c61841c-fd07-4c84-8591-f01cbadc6d42","Type":"ContainerDied","Data":"ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff"} Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.524151 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7tsq" event={"ID":"1c61841c-fd07-4c84-8591-f01cbadc6d42","Type":"ContainerDied","Data":"b90cf755ed3f37c9650919968be937d3b240d13ac450ebc68250c725b1029e6c"} Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.524225 4689 scope.go:117] "RemoveContainer" containerID="ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff" Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.524529 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x7tsq" Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.558040 4689 scope.go:117] "RemoveContainer" containerID="1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c" Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.596854 4689 scope.go:117] "RemoveContainer" containerID="f5042e587670300b48f3efb240fc386fb1d83cdd161ff45647e6ad267cb3cef8" Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.596982 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x7tsq"] Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.602832 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x7tsq"] Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.635323 4689 scope.go:117] "RemoveContainer" containerID="ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff" Mar 07 04:54:22 crc kubenswrapper[4689]: E0307 04:54:22.635825 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff\": container with ID starting with ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff not found: ID does not exist" containerID="ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff" Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.635868 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff"} err="failed to get container status \"ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff\": rpc error: code = NotFound desc = could not find container \"ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff\": container with ID starting with ead31be9e64ccd4792f88d5cf2b4c965e186a25daa6112179418fbc1913cb2ff not found: ID does 
not exist" Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.635893 4689 scope.go:117] "RemoveContainer" containerID="1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c" Mar 07 04:54:22 crc kubenswrapper[4689]: E0307 04:54:22.636572 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c\": container with ID starting with 1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c not found: ID does not exist" containerID="1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c" Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.636624 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c"} err="failed to get container status \"1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c\": rpc error: code = NotFound desc = could not find container \"1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c\": container with ID starting with 1de922dcf46aa1d858a9eb1dad31ac552492c9f506280615f4f4b724b208712c not found: ID does not exist" Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.636715 4689 scope.go:117] "RemoveContainer" containerID="f5042e587670300b48f3efb240fc386fb1d83cdd161ff45647e6ad267cb3cef8" Mar 07 04:54:22 crc kubenswrapper[4689]: E0307 04:54:22.637211 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5042e587670300b48f3efb240fc386fb1d83cdd161ff45647e6ad267cb3cef8\": container with ID starting with f5042e587670300b48f3efb240fc386fb1d83cdd161ff45647e6ad267cb3cef8 not found: ID does not exist" containerID="f5042e587670300b48f3efb240fc386fb1d83cdd161ff45647e6ad267cb3cef8" Mar 07 04:54:22 crc kubenswrapper[4689]: I0307 04:54:22.637251 4689 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5042e587670300b48f3efb240fc386fb1d83cdd161ff45647e6ad267cb3cef8"} err="failed to get container status \"f5042e587670300b48f3efb240fc386fb1d83cdd161ff45647e6ad267cb3cef8\": rpc error: code = NotFound desc = could not find container \"f5042e587670300b48f3efb240fc386fb1d83cdd161ff45647e6ad267cb3cef8\": container with ID starting with f5042e587670300b48f3efb240fc386fb1d83cdd161ff45647e6ad267cb3cef8 not found: ID does not exist" Mar 07 04:54:23 crc kubenswrapper[4689]: I0307 04:54:23.835240 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c61841c-fd07-4c84-8591-f01cbadc6d42" path="/var/lib/kubelet/pods/1c61841c-fd07-4c84-8591-f01cbadc6d42/volumes" Mar 07 04:54:29 crc kubenswrapper[4689]: I0307 04:54:29.589313 4689 generic.go:334] "Generic (PLEG): container finished" podID="8d940c90-cfda-4c9e-9066-a5258ce6604c" containerID="f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba" exitCode=0 Mar 07 04:54:29 crc kubenswrapper[4689]: I0307 04:54:29.589428 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-97nfh/must-gather-4mw4v" event={"ID":"8d940c90-cfda-4c9e-9066-a5258ce6604c","Type":"ContainerDied","Data":"f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba"} Mar 07 04:54:29 crc kubenswrapper[4689]: I0307 04:54:29.590905 4689 scope.go:117] "RemoveContainer" containerID="f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba" Mar 07 04:54:30 crc kubenswrapper[4689]: I0307 04:54:30.292958 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-97nfh_must-gather-4mw4v_8d940c90-cfda-4c9e-9066-a5258ce6604c/gather/0.log" Mar 07 04:54:38 crc kubenswrapper[4689]: I0307 04:54:38.953811 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-97nfh/must-gather-4mw4v"] Mar 07 04:54:38 crc kubenswrapper[4689]: I0307 04:54:38.954994 4689 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-must-gather-97nfh/must-gather-4mw4v" podUID="8d940c90-cfda-4c9e-9066-a5258ce6604c" containerName="copy" containerID="cri-o://97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36" gracePeriod=2 Mar 07 04:54:38 crc kubenswrapper[4689]: I0307 04:54:38.966898 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-97nfh/must-gather-4mw4v"] Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.376335 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-97nfh_must-gather-4mw4v_8d940c90-cfda-4c9e-9066-a5258ce6604c/copy/0.log" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.377589 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-97nfh/must-gather-4mw4v" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.485649 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgbzg\" (UniqueName: \"kubernetes.io/projected/8d940c90-cfda-4c9e-9066-a5258ce6604c-kube-api-access-hgbzg\") pod \"8d940c90-cfda-4c9e-9066-a5258ce6604c\" (UID: \"8d940c90-cfda-4c9e-9066-a5258ce6604c\") " Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.485875 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d940c90-cfda-4c9e-9066-a5258ce6604c-must-gather-output\") pod \"8d940c90-cfda-4c9e-9066-a5258ce6604c\" (UID: \"8d940c90-cfda-4c9e-9066-a5258ce6604c\") " Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.499499 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d940c90-cfda-4c9e-9066-a5258ce6604c-kube-api-access-hgbzg" (OuterVolumeSpecName: "kube-api-access-hgbzg") pod "8d940c90-cfda-4c9e-9066-a5258ce6604c" (UID: "8d940c90-cfda-4c9e-9066-a5258ce6604c"). InnerVolumeSpecName "kube-api-access-hgbzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.564219 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d940c90-cfda-4c9e-9066-a5258ce6604c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8d940c90-cfda-4c9e-9066-a5258ce6604c" (UID: "8d940c90-cfda-4c9e-9066-a5258ce6604c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.587133 4689 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d940c90-cfda-4c9e-9066-a5258ce6604c-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.587235 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgbzg\" (UniqueName: \"kubernetes.io/projected/8d940c90-cfda-4c9e-9066-a5258ce6604c-kube-api-access-hgbzg\") on node \"crc\" DevicePath \"\"" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.661597 4689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-97nfh_must-gather-4mw4v_8d940c90-cfda-4c9e-9066-a5258ce6604c/copy/0.log" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.661995 4689 generic.go:334] "Generic (PLEG): container finished" podID="8d940c90-cfda-4c9e-9066-a5258ce6604c" containerID="97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36" exitCode=143 Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.662038 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-97nfh/must-gather-4mw4v" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.662058 4689 scope.go:117] "RemoveContainer" containerID="97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.677844 4689 scope.go:117] "RemoveContainer" containerID="f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.735293 4689 scope.go:117] "RemoveContainer" containerID="97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36" Mar 07 04:54:39 crc kubenswrapper[4689]: E0307 04:54:39.736001 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36\": container with ID starting with 97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36 not found: ID does not exist" containerID="97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.736037 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36"} err="failed to get container status \"97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36\": rpc error: code = NotFound desc = could not find container \"97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36\": container with ID starting with 97863b38f9f39863f8c104726314846964f876b978e0ba93c1e176a40282dd36 not found: ID does not exist" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.736061 4689 scope.go:117] "RemoveContainer" containerID="f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba" Mar 07 04:54:39 crc kubenswrapper[4689]: E0307 04:54:39.736449 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba\": container with ID starting with f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba not found: ID does not exist" containerID="f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.736471 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba"} err="failed to get container status \"f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba\": rpc error: code = NotFound desc = could not find container \"f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba\": container with ID starting with f75aa7646913c18220e28d22276c3c1dc09dd5ba8418efb36a8eb3c9083b40ba not found: ID does not exist" Mar 07 04:54:39 crc kubenswrapper[4689]: I0307 04:54:39.834200 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d940c90-cfda-4c9e-9066-a5258ce6604c" path="/var/lib/kubelet/pods/8d940c90-cfda-4c9e-9066-a5258ce6604c/volumes" Mar 07 04:54:54 crc kubenswrapper[4689]: I0307 04:54:54.301991 4689 scope.go:117] "RemoveContainer" containerID="ec8c07d09e7aa473f3218f9b8d32c92942d5ba8b6a143fd180e658cfefff9de5" Mar 07 04:55:22 crc kubenswrapper[4689]: E0307 04:55:22.920849 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Mar 07 04:55:22 crc kubenswrapper[4689]: E0307 04:55:22.921659 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:57:24.921634766 +0000 UTC m=+2289.968018265 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found Mar 07 04:55:22 crc kubenswrapper[4689]: E0307 04:55:22.921356 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Mar 07 04:55:22 crc kubenswrapper[4689]: E0307 04:55:22.921761 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:57:24.921737039 +0000 UTC m=+2289.968120538 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found Mar 07 04:55:59 crc kubenswrapper[4689]: I0307 04:55:59.189528 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:55:59 crc kubenswrapper[4689]: I0307 04:55:59.190490 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.154689 4689 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29547656-qxgxb"] Mar 07 04:56:00 crc kubenswrapper[4689]: E0307 04:56:00.154954 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d940c90-cfda-4c9e-9066-a5258ce6604c" containerName="gather" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.154969 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d940c90-cfda-4c9e-9066-a5258ce6604c" containerName="gather" Mar 07 04:56:00 crc kubenswrapper[4689]: E0307 04:56:00.154978 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c61841c-fd07-4c84-8591-f01cbadc6d42" containerName="registry-server" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.154987 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c61841c-fd07-4c84-8591-f01cbadc6d42" containerName="registry-server" Mar 07 04:56:00 crc kubenswrapper[4689]: E0307 04:56:00.155010 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c0ec17-3f11-4b38-a51a-d66ccebba90d" containerName="oc" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.155019 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c0ec17-3f11-4b38-a51a-d66ccebba90d" containerName="oc" Mar 07 04:56:00 crc kubenswrapper[4689]: E0307 04:56:00.155033 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c61841c-fd07-4c84-8591-f01cbadc6d42" containerName="extract-utilities" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.155040 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c61841c-fd07-4c84-8591-f01cbadc6d42" containerName="extract-utilities" Mar 07 04:56:00 crc kubenswrapper[4689]: E0307 04:56:00.155061 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c61841c-fd07-4c84-8591-f01cbadc6d42" containerName="extract-content" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.155068 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c61841c-fd07-4c84-8591-f01cbadc6d42" 
containerName="extract-content" Mar 07 04:56:00 crc kubenswrapper[4689]: E0307 04:56:00.155078 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d940c90-cfda-4c9e-9066-a5258ce6604c" containerName="copy" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.155084 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d940c90-cfda-4c9e-9066-a5258ce6604c" containerName="copy" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.155239 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d940c90-cfda-4c9e-9066-a5258ce6604c" containerName="copy" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.155262 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c0ec17-3f11-4b38-a51a-d66ccebba90d" containerName="oc" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.155275 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c61841c-fd07-4c84-8591-f01cbadc6d42" containerName="registry-server" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.155283 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d940c90-cfda-4c9e-9066-a5258ce6604c" containerName="gather" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.155720 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547656-qxgxb" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.160065 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.160283 4689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.162084 4689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8r5ws" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.185765 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547656-qxgxb"] Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.284685 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltwsx\" (UniqueName: \"kubernetes.io/projected/afb9af32-2286-4f72-a00c-41ec8ebe97b1-kube-api-access-ltwsx\") pod \"auto-csr-approver-29547656-qxgxb\" (UID: \"afb9af32-2286-4f72-a00c-41ec8ebe97b1\") " pod="openshift-infra/auto-csr-approver-29547656-qxgxb" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.386275 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltwsx\" (UniqueName: \"kubernetes.io/projected/afb9af32-2286-4f72-a00c-41ec8ebe97b1-kube-api-access-ltwsx\") pod \"auto-csr-approver-29547656-qxgxb\" (UID: \"afb9af32-2286-4f72-a00c-41ec8ebe97b1\") " pod="openshift-infra/auto-csr-approver-29547656-qxgxb" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.408059 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltwsx\" (UniqueName: \"kubernetes.io/projected/afb9af32-2286-4f72-a00c-41ec8ebe97b1-kube-api-access-ltwsx\") pod \"auto-csr-approver-29547656-qxgxb\" (UID: \"afb9af32-2286-4f72-a00c-41ec8ebe97b1\") " 
pod="openshift-infra/auto-csr-approver-29547656-qxgxb" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.487881 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547656-qxgxb" Mar 07 04:56:00 crc kubenswrapper[4689]: I0307 04:56:00.768969 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547656-qxgxb"] Mar 07 04:56:00 crc kubenswrapper[4689]: W0307 04:56:00.782960 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafb9af32_2286_4f72_a00c_41ec8ebe97b1.slice/crio-e9c5515ec4fdd425b561e1eda96c9078811bde9f1b5e4d882258a951b01ff037 WatchSource:0}: Error finding container e9c5515ec4fdd425b561e1eda96c9078811bde9f1b5e4d882258a951b01ff037: Status 404 returned error can't find the container with id e9c5515ec4fdd425b561e1eda96c9078811bde9f1b5e4d882258a951b01ff037 Mar 07 04:56:01 crc kubenswrapper[4689]: I0307 04:56:01.239215 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547656-qxgxb" event={"ID":"afb9af32-2286-4f72-a00c-41ec8ebe97b1","Type":"ContainerStarted","Data":"e9c5515ec4fdd425b561e1eda96c9078811bde9f1b5e4d882258a951b01ff037"} Mar 07 04:56:02 crc kubenswrapper[4689]: I0307 04:56:02.246852 4689 generic.go:334] "Generic (PLEG): container finished" podID="afb9af32-2286-4f72-a00c-41ec8ebe97b1" containerID="e9b810547a9fa6272ab63dd420f9b258fbea42b5fe1ae31addbbf43726934b3e" exitCode=0 Mar 07 04:56:02 crc kubenswrapper[4689]: I0307 04:56:02.246915 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547656-qxgxb" event={"ID":"afb9af32-2286-4f72-a00c-41ec8ebe97b1","Type":"ContainerDied","Data":"e9b810547a9fa6272ab63dd420f9b258fbea42b5fe1ae31addbbf43726934b3e"} Mar 07 04:56:03 crc kubenswrapper[4689]: I0307 04:56:03.634762 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547656-qxgxb" Mar 07 04:56:03 crc kubenswrapper[4689]: I0307 04:56:03.733517 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltwsx\" (UniqueName: \"kubernetes.io/projected/afb9af32-2286-4f72-a00c-41ec8ebe97b1-kube-api-access-ltwsx\") pod \"afb9af32-2286-4f72-a00c-41ec8ebe97b1\" (UID: \"afb9af32-2286-4f72-a00c-41ec8ebe97b1\") " Mar 07 04:56:03 crc kubenswrapper[4689]: I0307 04:56:03.740224 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb9af32-2286-4f72-a00c-41ec8ebe97b1-kube-api-access-ltwsx" (OuterVolumeSpecName: "kube-api-access-ltwsx") pod "afb9af32-2286-4f72-a00c-41ec8ebe97b1" (UID: "afb9af32-2286-4f72-a00c-41ec8ebe97b1"). InnerVolumeSpecName "kube-api-access-ltwsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:56:03 crc kubenswrapper[4689]: I0307 04:56:03.838350 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltwsx\" (UniqueName: \"kubernetes.io/projected/afb9af32-2286-4f72-a00c-41ec8ebe97b1-kube-api-access-ltwsx\") on node \"crc\" DevicePath \"\"" Mar 07 04:56:04 crc kubenswrapper[4689]: I0307 04:56:04.265873 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547656-qxgxb" event={"ID":"afb9af32-2286-4f72-a00c-41ec8ebe97b1","Type":"ContainerDied","Data":"e9c5515ec4fdd425b561e1eda96c9078811bde9f1b5e4d882258a951b01ff037"} Mar 07 04:56:04 crc kubenswrapper[4689]: I0307 04:56:04.265936 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547656-qxgxb" Mar 07 04:56:04 crc kubenswrapper[4689]: I0307 04:56:04.265944 4689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9c5515ec4fdd425b561e1eda96c9078811bde9f1b5e4d882258a951b01ff037" Mar 07 04:56:04 crc kubenswrapper[4689]: I0307 04:56:04.706096 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547650-96hvh"] Mar 07 04:56:04 crc kubenswrapper[4689]: I0307 04:56:04.709220 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547650-96hvh"] Mar 07 04:56:05 crc kubenswrapper[4689]: I0307 04:56:05.841881 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c4a9dd-74e6-415a-9d04-783a003dd6e7" path="/var/lib/kubelet/pods/d8c4a9dd-74e6-415a-9d04-783a003dd6e7/volumes" Mar 07 04:56:29 crc kubenswrapper[4689]: I0307 04:56:29.189819 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:56:29 crc kubenswrapper[4689]: I0307 04:56:29.191379 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.745199 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7kgdr"] Mar 07 04:56:43 crc kubenswrapper[4689]: E0307 04:56:43.746086 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb9af32-2286-4f72-a00c-41ec8ebe97b1" 
containerName="oc" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.746104 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb9af32-2286-4f72-a00c-41ec8ebe97b1" containerName="oc" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.746273 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb9af32-2286-4f72-a00c-41ec8ebe97b1" containerName="oc" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.747286 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.751871 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7kgdr"] Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.894958 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-catalog-content\") pod \"community-operators-7kgdr\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.895345 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t5gl\" (UniqueName: \"kubernetes.io/projected/ddb6db14-07de-4191-9e3c-df8642dbd70a-kube-api-access-4t5gl\") pod \"community-operators-7kgdr\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.895386 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-utilities\") pod \"community-operators-7kgdr\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " 
pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.996864 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-utilities\") pod \"community-operators-7kgdr\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.996942 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-catalog-content\") pod \"community-operators-7kgdr\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.997019 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t5gl\" (UniqueName: \"kubernetes.io/projected/ddb6db14-07de-4191-9e3c-df8642dbd70a-kube-api-access-4t5gl\") pod \"community-operators-7kgdr\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.997417 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-utilities\") pod \"community-operators-7kgdr\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:43 crc kubenswrapper[4689]: I0307 04:56:43.997472 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-catalog-content\") pod \"community-operators-7kgdr\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " 
pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:44 crc kubenswrapper[4689]: I0307 04:56:44.018258 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t5gl\" (UniqueName: \"kubernetes.io/projected/ddb6db14-07de-4191-9e3c-df8642dbd70a-kube-api-access-4t5gl\") pod \"community-operators-7kgdr\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:44 crc kubenswrapper[4689]: I0307 04:56:44.084055 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:44 crc kubenswrapper[4689]: I0307 04:56:44.353833 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7kgdr"] Mar 07 04:56:44 crc kubenswrapper[4689]: I0307 04:56:44.557678 4689 generic.go:334] "Generic (PLEG): container finished" podID="ddb6db14-07de-4191-9e3c-df8642dbd70a" containerID="4c680f4199e3351dbd361df944d6c246ad83153b806d83d6c03f48da31b3c887" exitCode=0 Mar 07 04:56:44 crc kubenswrapper[4689]: I0307 04:56:44.557715 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kgdr" event={"ID":"ddb6db14-07de-4191-9e3c-df8642dbd70a","Type":"ContainerDied","Data":"4c680f4199e3351dbd361df944d6c246ad83153b806d83d6c03f48da31b3c887"} Mar 07 04:56:44 crc kubenswrapper[4689]: I0307 04:56:44.557740 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kgdr" event={"ID":"ddb6db14-07de-4191-9e3c-df8642dbd70a","Type":"ContainerStarted","Data":"f9b071b24a8e383bb10d29c17377478e51fcab2ff388c520186540bd420fcd0d"} Mar 07 04:56:45 crc kubenswrapper[4689]: I0307 04:56:45.565317 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kgdr" 
event={"ID":"ddb6db14-07de-4191-9e3c-df8642dbd70a","Type":"ContainerStarted","Data":"95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a"} Mar 07 04:56:46 crc kubenswrapper[4689]: I0307 04:56:46.577448 4689 generic.go:334] "Generic (PLEG): container finished" podID="ddb6db14-07de-4191-9e3c-df8642dbd70a" containerID="95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a" exitCode=0 Mar 07 04:56:46 crc kubenswrapper[4689]: I0307 04:56:46.577519 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kgdr" event={"ID":"ddb6db14-07de-4191-9e3c-df8642dbd70a","Type":"ContainerDied","Data":"95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a"} Mar 07 04:56:46 crc kubenswrapper[4689]: I0307 04:56:46.926072 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pzmrx"] Mar 07 04:56:46 crc kubenswrapper[4689]: I0307 04:56:46.927589 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:46 crc kubenswrapper[4689]: I0307 04:56:46.944548 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzmrx"] Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.038868 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvvpn\" (UniqueName: \"kubernetes.io/projected/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-kube-api-access-kvvpn\") pod \"redhat-marketplace-pzmrx\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") " pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.039301 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-catalog-content\") pod \"redhat-marketplace-pzmrx\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") " pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.039352 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-utilities\") pod \"redhat-marketplace-pzmrx\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") " pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.140304 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvvpn\" (UniqueName: \"kubernetes.io/projected/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-kube-api-access-kvvpn\") pod \"redhat-marketplace-pzmrx\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") " pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.140477 4689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-catalog-content\") pod \"redhat-marketplace-pzmrx\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") " pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.140511 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-utilities\") pod \"redhat-marketplace-pzmrx\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") " pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.141055 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-utilities\") pod \"redhat-marketplace-pzmrx\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") " pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.141241 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-catalog-content\") pod \"redhat-marketplace-pzmrx\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") " pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.179388 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvvpn\" (UniqueName: \"kubernetes.io/projected/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-kube-api-access-kvvpn\") pod \"redhat-marketplace-pzmrx\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") " pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.252709 4689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.477650 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzmrx"] Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.586014 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kgdr" event={"ID":"ddb6db14-07de-4191-9e3c-df8642dbd70a","Type":"ContainerStarted","Data":"91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435"} Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.587831 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzmrx" event={"ID":"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9","Type":"ContainerStarted","Data":"d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a"} Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.587857 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzmrx" event={"ID":"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9","Type":"ContainerStarted","Data":"891504d2d03967b19e8f836f8c6e6a75883ef97407beb69c94ba115f43243f0f"} Mar 07 04:56:47 crc kubenswrapper[4689]: I0307 04:56:47.607849 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7kgdr" podStartSLOduration=2.198126715 podStartE2EDuration="4.607834497s" podCreationTimestamp="2026-03-07 04:56:43 +0000 UTC" firstStartedPulling="2026-03-07 04:56:44.559198339 +0000 UTC m=+2249.605581828" lastFinishedPulling="2026-03-07 04:56:46.968906101 +0000 UTC m=+2252.015289610" observedRunningTime="2026-03-07 04:56:47.60535751 +0000 UTC m=+2252.651740999" watchObservedRunningTime="2026-03-07 04:56:47.607834497 +0000 UTC m=+2252.654217986" Mar 07 04:56:48 crc kubenswrapper[4689]: I0307 04:56:48.597363 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" containerID="d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a" exitCode=0 Mar 07 04:56:48 crc kubenswrapper[4689]: I0307 04:56:48.597477 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzmrx" event={"ID":"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9","Type":"ContainerDied","Data":"d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a"} Mar 07 04:56:49 crc kubenswrapper[4689]: I0307 04:56:49.607893 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzmrx" event={"ID":"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9","Type":"ContainerStarted","Data":"253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f"} Mar 07 04:56:50 crc kubenswrapper[4689]: I0307 04:56:50.618099 4689 generic.go:334] "Generic (PLEG): container finished" podID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" containerID="253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f" exitCode=0 Mar 07 04:56:50 crc kubenswrapper[4689]: I0307 04:56:50.618199 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzmrx" event={"ID":"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9","Type":"ContainerDied","Data":"253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f"} Mar 07 04:56:51 crc kubenswrapper[4689]: I0307 04:56:51.627878 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzmrx" event={"ID":"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9","Type":"ContainerStarted","Data":"9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad"} Mar 07 04:56:51 crc kubenswrapper[4689]: I0307 04:56:51.650413 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pzmrx" podStartSLOduration=3.191873775 podStartE2EDuration="5.650389087s" podCreationTimestamp="2026-03-07 04:56:46 +0000 UTC" 
firstStartedPulling="2026-03-07 04:56:48.599507329 +0000 UTC m=+2253.645890818" lastFinishedPulling="2026-03-07 04:56:51.058022611 +0000 UTC m=+2256.104406130" observedRunningTime="2026-03-07 04:56:51.645730861 +0000 UTC m=+2256.692114430" watchObservedRunningTime="2026-03-07 04:56:51.650389087 +0000 UTC m=+2256.696772616" Mar 07 04:56:54 crc kubenswrapper[4689]: I0307 04:56:54.084433 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:54 crc kubenswrapper[4689]: I0307 04:56:54.084496 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:54 crc kubenswrapper[4689]: I0307 04:56:54.157157 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:54 crc kubenswrapper[4689]: I0307 04:56:54.387790 4689 scope.go:117] "RemoveContainer" containerID="ed3222efb5f143fd87f78213a58a7c30dc8d4c5b626246d71c6e9ea2d50ef439" Mar 07 04:56:54 crc kubenswrapper[4689]: I0307 04:56:54.717471 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:55 crc kubenswrapper[4689]: I0307 04:56:55.322531 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7kgdr"] Mar 07 04:56:56 crc kubenswrapper[4689]: I0307 04:56:56.663576 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7kgdr" podUID="ddb6db14-07de-4191-9e3c-df8642dbd70a" containerName="registry-server" containerID="cri-o://91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435" gracePeriod=2 Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.036027 4689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.109083 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-catalog-content\") pod \"ddb6db14-07de-4191-9e3c-df8642dbd70a\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.109153 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t5gl\" (UniqueName: \"kubernetes.io/projected/ddb6db14-07de-4191-9e3c-df8642dbd70a-kube-api-access-4t5gl\") pod \"ddb6db14-07de-4191-9e3c-df8642dbd70a\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.109374 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-utilities\") pod \"ddb6db14-07de-4191-9e3c-df8642dbd70a\" (UID: \"ddb6db14-07de-4191-9e3c-df8642dbd70a\") " Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.110545 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-utilities" (OuterVolumeSpecName: "utilities") pod "ddb6db14-07de-4191-9e3c-df8642dbd70a" (UID: "ddb6db14-07de-4191-9e3c-df8642dbd70a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.116479 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb6db14-07de-4191-9e3c-df8642dbd70a-kube-api-access-4t5gl" (OuterVolumeSpecName: "kube-api-access-4t5gl") pod "ddb6db14-07de-4191-9e3c-df8642dbd70a" (UID: "ddb6db14-07de-4191-9e3c-df8642dbd70a"). InnerVolumeSpecName "kube-api-access-4t5gl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.165689 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddb6db14-07de-4191-9e3c-df8642dbd70a" (UID: "ddb6db14-07de-4191-9e3c-df8642dbd70a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.211377 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.211404 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddb6db14-07de-4191-9e3c-df8642dbd70a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.211433 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t5gl\" (UniqueName: \"kubernetes.io/projected/ddb6db14-07de-4191-9e3c-df8642dbd70a-kube-api-access-4t5gl\") on node \"crc\" DevicePath \"\"" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.253549 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.253708 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.322589 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.671868 4689 generic.go:334] "Generic (PLEG): container finished" 
podID="ddb6db14-07de-4191-9e3c-df8642dbd70a" containerID="91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435" exitCode=0 Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.672675 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7kgdr" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.673333 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kgdr" event={"ID":"ddb6db14-07de-4191-9e3c-df8642dbd70a","Type":"ContainerDied","Data":"91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435"} Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.673387 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kgdr" event={"ID":"ddb6db14-07de-4191-9e3c-df8642dbd70a","Type":"ContainerDied","Data":"f9b071b24a8e383bb10d29c17377478e51fcab2ff388c520186540bd420fcd0d"} Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.673413 4689 scope.go:117] "RemoveContainer" containerID="91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.698974 4689 scope.go:117] "RemoveContainer" containerID="95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.703609 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7kgdr"] Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.711914 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7kgdr"] Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.720280 4689 scope.go:117] "RemoveContainer" containerID="4c680f4199e3351dbd361df944d6c246ad83153b806d83d6c03f48da31b3c887" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.720541 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-pzmrx" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.743866 4689 scope.go:117] "RemoveContainer" containerID="91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435" Mar 07 04:56:57 crc kubenswrapper[4689]: E0307 04:56:57.744354 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435\": container with ID starting with 91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435 not found: ID does not exist" containerID="91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.744388 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435"} err="failed to get container status \"91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435\": rpc error: code = NotFound desc = could not find container \"91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435\": container with ID starting with 91905f9e85865b70dc21ac39550ebcad7e8a050b518c31a5a7828186ca3a9435 not found: ID does not exist" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.744412 4689 scope.go:117] "RemoveContainer" containerID="95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a" Mar 07 04:56:57 crc kubenswrapper[4689]: E0307 04:56:57.744780 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a\": container with ID starting with 95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a not found: ID does not exist" containerID="95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.744893 
4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a"} err="failed to get container status \"95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a\": rpc error: code = NotFound desc = could not find container \"95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a\": container with ID starting with 95b3d517eeb1188ccdcb57cc328b31619bed96449eae9bfd87bd8c09ec6d1d8a not found: ID does not exist" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.744983 4689 scope.go:117] "RemoveContainer" containerID="4c680f4199e3351dbd361df944d6c246ad83153b806d83d6c03f48da31b3c887" Mar 07 04:56:57 crc kubenswrapper[4689]: E0307 04:56:57.745533 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c680f4199e3351dbd361df944d6c246ad83153b806d83d6c03f48da31b3c887\": container with ID starting with 4c680f4199e3351dbd361df944d6c246ad83153b806d83d6c03f48da31b3c887 not found: ID does not exist" containerID="4c680f4199e3351dbd361df944d6c246ad83153b806d83d6c03f48da31b3c887" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.745563 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c680f4199e3351dbd361df944d6c246ad83153b806d83d6c03f48da31b3c887"} err="failed to get container status \"4c680f4199e3351dbd361df944d6c246ad83153b806d83d6c03f48da31b3c887\": rpc error: code = NotFound desc = could not find container \"4c680f4199e3351dbd361df944d6c246ad83153b806d83d6c03f48da31b3c887\": container with ID starting with 4c680f4199e3351dbd361df944d6c246ad83153b806d83d6c03f48da31b3c887 not found: ID does not exist" Mar 07 04:56:57 crc kubenswrapper[4689]: I0307 04:56:57.833418 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddb6db14-07de-4191-9e3c-df8642dbd70a" 
path="/var/lib/kubelet/pods/ddb6db14-07de-4191-9e3c-df8642dbd70a/volumes" Mar 07 04:56:59 crc kubenswrapper[4689]: I0307 04:56:59.190098 4689 patch_prober.go:28] interesting pod/machine-config-daemon-dss5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 04:56:59 crc kubenswrapper[4689]: I0307 04:56:59.190200 4689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 04:56:59 crc kubenswrapper[4689]: I0307 04:56:59.190252 4689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" Mar 07 04:56:59 crc kubenswrapper[4689]: I0307 04:56:59.190993 4689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5e920e422f290f0adaacaa8069f13f665148bd13172616c95c274c51f3d032d"} pod="openshift-machine-config-operator/machine-config-daemon-dss5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 04:56:59 crc kubenswrapper[4689]: I0307 04:56:59.191087 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerName="machine-config-daemon" containerID="cri-o://a5e920e422f290f0adaacaa8069f13f665148bd13172616c95c274c51f3d032d" gracePeriod=600 Mar 07 04:56:59 crc kubenswrapper[4689]: E0307 04:56:59.312929 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:56:59 crc kubenswrapper[4689]: I0307 04:56:59.689422 4689 generic.go:334] "Generic (PLEG): container finished" podID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" containerID="a5e920e422f290f0adaacaa8069f13f665148bd13172616c95c274c51f3d032d" exitCode=0 Mar 07 04:56:59 crc kubenswrapper[4689]: I0307 04:56:59.689488 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" event={"ID":"e6e9469a-474b-45c6-b3bd-638cb7a2e226","Type":"ContainerDied","Data":"a5e920e422f290f0adaacaa8069f13f665148bd13172616c95c274c51f3d032d"} Mar 07 04:56:59 crc kubenswrapper[4689]: I0307 04:56:59.689573 4689 scope.go:117] "RemoveContainer" containerID="fb4b2124c937a1be1975e56b486e30c77111eb4ae794931d0065d01fdf7d1cc6" Mar 07 04:56:59 crc kubenswrapper[4689]: I0307 04:56:59.690337 4689 scope.go:117] "RemoveContainer" containerID="a5e920e422f290f0adaacaa8069f13f665148bd13172616c95c274c51f3d032d" Mar 07 04:56:59 crc kubenswrapper[4689]: E0307 04:56:59.690710 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226" Mar 07 04:56:59 crc kubenswrapper[4689]: I0307 04:56:59.730093 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzmrx"] Mar 07 04:56:59 crc 
kubenswrapper[4689]: I0307 04:56:59.730321 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pzmrx" podUID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" containerName="registry-server" containerID="cri-o://9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad" gracePeriod=2
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.133885 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzmrx"
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.258726 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-catalog-content\") pod \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") "
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.258846 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvvpn\" (UniqueName: \"kubernetes.io/projected/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-kube-api-access-kvvpn\") pod \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") "
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.258897 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-utilities\") pod \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\" (UID: \"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9\") "
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.259877 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-utilities" (OuterVolumeSpecName: "utilities") pod "d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" (UID: "d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.267689 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-kube-api-access-kvvpn" (OuterVolumeSpecName: "kube-api-access-kvvpn") pod "d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" (UID: "d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9"). InnerVolumeSpecName "kube-api-access-kvvpn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.326395 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" (UID: "d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.359956 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-utilities\") on node \"crc\" DevicePath \"\""
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.359993 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.360005 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvvpn\" (UniqueName: \"kubernetes.io/projected/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9-kube-api-access-kvvpn\") on node \"crc\" DevicePath \"\""
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.703113 4689 generic.go:334] "Generic (PLEG): container finished" podID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" containerID="9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad" exitCode=0
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.703164 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzmrx" event={"ID":"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9","Type":"ContainerDied","Data":"9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad"}
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.703229 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzmrx"
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.703281 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzmrx" event={"ID":"d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9","Type":"ContainerDied","Data":"891504d2d03967b19e8f836f8c6e6a75883ef97407beb69c94ba115f43243f0f"}
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.703323 4689 scope.go:117] "RemoveContainer" containerID="9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad"
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.730899 4689 scope.go:117] "RemoveContainer" containerID="253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f"
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.732636 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzmrx"]
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.745196 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzmrx"]
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.772663 4689 scope.go:117] "RemoveContainer" containerID="d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a"
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.794856 4689 scope.go:117] "RemoveContainer" containerID="9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad"
Mar 07 04:57:00 crc kubenswrapper[4689]: E0307 04:57:00.795553 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad\": container with ID starting with 9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad not found: ID does not exist" containerID="9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad"
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.795623 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad"} err="failed to get container status \"9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad\": rpc error: code = NotFound desc = could not find container \"9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad\": container with ID starting with 9d6ab3215c1ca5b61f6b7696d5645d38f02db3bca7a26f3f09d0d0ff3a69dfad not found: ID does not exist"
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.795670 4689 scope.go:117] "RemoveContainer" containerID="253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f"
Mar 07 04:57:00 crc kubenswrapper[4689]: E0307 04:57:00.796152 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f\": container with ID starting with 253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f not found: ID does not exist" containerID="253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f"
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.796202 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f"} err="failed to get container status \"253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f\": rpc error: code = NotFound desc = could not find container \"253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f\": container with ID starting with 253036035be2b95b1ef0508ca90c1332d3d049897178e4cbfc5116aa80a5334f not found: ID does not exist"
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.796236 4689 scope.go:117] "RemoveContainer" containerID="d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a"
Mar 07 04:57:00 crc kubenswrapper[4689]: E0307 04:57:00.796657 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a\": container with ID starting with d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a not found: ID does not exist" containerID="d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a"
Mar 07 04:57:00 crc kubenswrapper[4689]: I0307 04:57:00.796702 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a"} err="failed to get container status \"d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a\": rpc error: code = NotFound desc = could not find container \"d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a\": container with ID starting with d8bbb9ebef15135af1be573298dd099b708139b8185ce029561247dc4f18049a not found: ID does not exist"
Mar 07 04:57:01 crc kubenswrapper[4689]: I0307 04:57:01.840544 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" path="/var/lib/kubelet/pods/d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9/volumes"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.128710 4689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-djn5b"]
Mar 07 04:57:04 crc kubenswrapper[4689]: E0307 04:57:04.129288 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" containerName="extract-utilities"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.129303 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" containerName="extract-utilities"
Mar 07 04:57:04 crc kubenswrapper[4689]: E0307 04:57:04.129327 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" containerName="extract-content"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.129334 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" containerName="extract-content"
Mar 07 04:57:04 crc kubenswrapper[4689]: E0307 04:57:04.129348 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb6db14-07de-4191-9e3c-df8642dbd70a" containerName="extract-utilities"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.129551 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb6db14-07de-4191-9e3c-df8642dbd70a" containerName="extract-utilities"
Mar 07 04:57:04 crc kubenswrapper[4689]: E0307 04:57:04.129561 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" containerName="registry-server"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.129569 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" containerName="registry-server"
Mar 07 04:57:04 crc kubenswrapper[4689]: E0307 04:57:04.129580 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb6db14-07de-4191-9e3c-df8642dbd70a" containerName="extract-content"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.129587 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb6db14-07de-4191-9e3c-df8642dbd70a" containerName="extract-content"
Mar 07 04:57:04 crc kubenswrapper[4689]: E0307 04:57:04.129596 4689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb6db14-07de-4191-9e3c-df8642dbd70a" containerName="registry-server"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.129603 4689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb6db14-07de-4191-9e3c-df8642dbd70a" containerName="registry-server"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.129757 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddb6db14-07de-4191-9e3c-df8642dbd70a" containerName="registry-server"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.129770 4689 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ed8dfa-10ee-4fc3-bf1b-27bf594e20e9" containerName="registry-server"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.130720 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.161270 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-djn5b"]
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.219777 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-utilities\") pod \"certified-operators-djn5b\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") " pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.219836 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-catalog-content\") pod \"certified-operators-djn5b\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") " pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.219860 4689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsncd\" (UniqueName: \"kubernetes.io/projected/d2fb24cf-51fa-412f-ba0d-76117c9e2949-kube-api-access-nsncd\") pod \"certified-operators-djn5b\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") " pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.321135 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-utilities\") pod \"certified-operators-djn5b\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") " pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.321205 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-catalog-content\") pod \"certified-operators-djn5b\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") " pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.321239 4689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsncd\" (UniqueName: \"kubernetes.io/projected/d2fb24cf-51fa-412f-ba0d-76117c9e2949-kube-api-access-nsncd\") pod \"certified-operators-djn5b\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") " pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.321740 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-utilities\") pod \"certified-operators-djn5b\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") " pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.321791 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-catalog-content\") pod \"certified-operators-djn5b\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") " pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.347871 4689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsncd\" (UniqueName: \"kubernetes.io/projected/d2fb24cf-51fa-412f-ba0d-76117c9e2949-kube-api-access-nsncd\") pod \"certified-operators-djn5b\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") " pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.448549 4689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.663323 4689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-djn5b"]
Mar 07 04:57:04 crc kubenswrapper[4689]: W0307 04:57:04.668655 4689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2fb24cf_51fa_412f_ba0d_76117c9e2949.slice/crio-40da4340a9f40e5c6eb9c73eda9f8da6ed210455585d2f1b5a53f5add95e79c6 WatchSource:0}: Error finding container 40da4340a9f40e5c6eb9c73eda9f8da6ed210455585d2f1b5a53f5add95e79c6: Status 404 returned error can't find the container with id 40da4340a9f40e5c6eb9c73eda9f8da6ed210455585d2f1b5a53f5add95e79c6
Mar 07 04:57:04 crc kubenswrapper[4689]: I0307 04:57:04.735009 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djn5b" event={"ID":"d2fb24cf-51fa-412f-ba0d-76117c9e2949","Type":"ContainerStarted","Data":"40da4340a9f40e5c6eb9c73eda9f8da6ed210455585d2f1b5a53f5add95e79c6"}
Mar 07 04:57:05 crc kubenswrapper[4689]: I0307 04:57:05.743792 4689 generic.go:334] "Generic (PLEG): container finished" podID="d2fb24cf-51fa-412f-ba0d-76117c9e2949" containerID="b1d7505d88144ac53a9d4d4a2e6f7426644947f86b8d03832f3f3e507616301f" exitCode=0
Mar 07 04:57:05 crc kubenswrapper[4689]: I0307 04:57:05.743834 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djn5b" event={"ID":"d2fb24cf-51fa-412f-ba0d-76117c9e2949","Type":"ContainerDied","Data":"b1d7505d88144ac53a9d4d4a2e6f7426644947f86b8d03832f3f3e507616301f"}
Mar 07 04:57:06 crc kubenswrapper[4689]: I0307 04:57:06.752344 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djn5b" event={"ID":"d2fb24cf-51fa-412f-ba0d-76117c9e2949","Type":"ContainerStarted","Data":"aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2"}
Mar 07 04:57:07 crc kubenswrapper[4689]: I0307 04:57:07.761315 4689 generic.go:334] "Generic (PLEG): container finished" podID="d2fb24cf-51fa-412f-ba0d-76117c9e2949" containerID="aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2" exitCode=0
Mar 07 04:57:07 crc kubenswrapper[4689]: I0307 04:57:07.761375 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djn5b" event={"ID":"d2fb24cf-51fa-412f-ba0d-76117c9e2949","Type":"ContainerDied","Data":"aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2"}
Mar 07 04:57:08 crc kubenswrapper[4689]: I0307 04:57:08.771459 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djn5b" event={"ID":"d2fb24cf-51fa-412f-ba0d-76117c9e2949","Type":"ContainerStarted","Data":"525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141"}
Mar 07 04:57:08 crc kubenswrapper[4689]: I0307 04:57:08.793690 4689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-djn5b" podStartSLOduration=2.392441372 podStartE2EDuration="4.793670664s" podCreationTimestamp="2026-03-07 04:57:04 +0000 UTC" firstStartedPulling="2026-03-07 04:57:05.74587172 +0000 UTC m=+2270.792255199" lastFinishedPulling="2026-03-07 04:57:08.147101002 +0000 UTC m=+2273.193484491" observedRunningTime="2026-03-07 04:57:08.790056776 +0000 UTC m=+2273.836440355" watchObservedRunningTime="2026-03-07 04:57:08.793670664 +0000 UTC m=+2273.840054163"
Mar 07 04:57:11 crc kubenswrapper[4689]: I0307 04:57:11.826560 4689 scope.go:117] "RemoveContainer" containerID="a5e920e422f290f0adaacaa8069f13f665148bd13172616c95c274c51f3d032d"
Mar 07 04:57:11 crc kubenswrapper[4689]: E0307 04:57:11.827364 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226"
Mar 07 04:57:14 crc kubenswrapper[4689]: I0307 04:57:14.449409 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:14 crc kubenswrapper[4689]: I0307 04:57:14.449802 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:14 crc kubenswrapper[4689]: I0307 04:57:14.492780 4689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:14 crc kubenswrapper[4689]: I0307 04:57:14.889534 4689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:16 crc kubenswrapper[4689]: I0307 04:57:16.353111 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-djn5b"]
Mar 07 04:57:17 crc kubenswrapper[4689]: I0307 04:57:17.836023 4689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-djn5b" podUID="d2fb24cf-51fa-412f-ba0d-76117c9e2949" containerName="registry-server" containerID="cri-o://525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141" gracePeriod=2
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.247667 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.336532 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsncd\" (UniqueName: \"kubernetes.io/projected/d2fb24cf-51fa-412f-ba0d-76117c9e2949-kube-api-access-nsncd\") pod \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") "
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.336884 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-catalog-content\") pod \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") "
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.336924 4689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-utilities\") pod \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\" (UID: \"d2fb24cf-51fa-412f-ba0d-76117c9e2949\") "
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.338055 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-utilities" (OuterVolumeSpecName: "utilities") pod "d2fb24cf-51fa-412f-ba0d-76117c9e2949" (UID: "d2fb24cf-51fa-412f-ba0d-76117c9e2949"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.344995 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2fb24cf-51fa-412f-ba0d-76117c9e2949-kube-api-access-nsncd" (OuterVolumeSpecName: "kube-api-access-nsncd") pod "d2fb24cf-51fa-412f-ba0d-76117c9e2949" (UID: "d2fb24cf-51fa-412f-ba0d-76117c9e2949"). InnerVolumeSpecName "kube-api-access-nsncd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.405055 4689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2fb24cf-51fa-412f-ba0d-76117c9e2949" (UID: "d2fb24cf-51fa-412f-ba0d-76117c9e2949"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.438038 4689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsncd\" (UniqueName: \"kubernetes.io/projected/d2fb24cf-51fa-412f-ba0d-76117c9e2949-kube-api-access-nsncd\") on node \"crc\" DevicePath \"\""
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.438094 4689 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.438112 4689 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fb24cf-51fa-412f-ba0d-76117c9e2949-utilities\") on node \"crc\" DevicePath \"\""
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.846525 4689 generic.go:334] "Generic (PLEG): container finished" podID="d2fb24cf-51fa-412f-ba0d-76117c9e2949" containerID="525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141" exitCode=0
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.846604 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djn5b" event={"ID":"d2fb24cf-51fa-412f-ba0d-76117c9e2949","Type":"ContainerDied","Data":"525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141"}
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.846652 4689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djn5b" event={"ID":"d2fb24cf-51fa-412f-ba0d-76117c9e2949","Type":"ContainerDied","Data":"40da4340a9f40e5c6eb9c73eda9f8da6ed210455585d2f1b5a53f5add95e79c6"}
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.846691 4689 scope.go:117] "RemoveContainer" containerID="525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141"
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.846897 4689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-djn5b"
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.870670 4689 scope.go:117] "RemoveContainer" containerID="aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2"
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.897324 4689 scope.go:117] "RemoveContainer" containerID="b1d7505d88144ac53a9d4d4a2e6f7426644947f86b8d03832f3f3e507616301f"
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.900998 4689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-djn5b"]
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.906591 4689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-djn5b"]
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.918076 4689 scope.go:117] "RemoveContainer" containerID="525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141"
Mar 07 04:57:18 crc kubenswrapper[4689]: E0307 04:57:18.918637 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141\": container with ID starting with 525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141 not found: ID does not exist" containerID="525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141"
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.918675 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141"} err="failed to get container status \"525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141\": rpc error: code = NotFound desc = could not find container \"525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141\": container with ID starting with 525c6b36011df9625fcdaf7d75f8820b0ef8d8e12a7a1665c91fb3b5a41f4141 not found: ID does not exist"
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.918703 4689 scope.go:117] "RemoveContainer" containerID="aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2"
Mar 07 04:57:18 crc kubenswrapper[4689]: E0307 04:57:18.919469 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2\": container with ID starting with aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2 not found: ID does not exist" containerID="aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2"
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.919506 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2"} err="failed to get container status \"aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2\": rpc error: code = NotFound desc = could not find container \"aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2\": container with ID starting with aefd1225ad9c3fb7c8e35a0d3aecb05ba7358641b5a769c29340f9823c403ce2 not found: ID does not exist"
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.919531 4689 scope.go:117] "RemoveContainer" containerID="b1d7505d88144ac53a9d4d4a2e6f7426644947f86b8d03832f3f3e507616301f"
Mar 07 04:57:18 crc kubenswrapper[4689]: E0307 04:57:18.919811 4689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d7505d88144ac53a9d4d4a2e6f7426644947f86b8d03832f3f3e507616301f\": container with ID starting with b1d7505d88144ac53a9d4d4a2e6f7426644947f86b8d03832f3f3e507616301f not found: ID does not exist" containerID="b1d7505d88144ac53a9d4d4a2e6f7426644947f86b8d03832f3f3e507616301f"
Mar 07 04:57:18 crc kubenswrapper[4689]: I0307 04:57:18.919837 4689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d7505d88144ac53a9d4d4a2e6f7426644947f86b8d03832f3f3e507616301f"} err="failed to get container status \"b1d7505d88144ac53a9d4d4a2e6f7426644947f86b8d03832f3f3e507616301f\": rpc error: code = NotFound desc = could not find container \"b1d7505d88144ac53a9d4d4a2e6f7426644947f86b8d03832f3f3e507616301f\": container with ID starting with b1d7505d88144ac53a9d4d4a2e6f7426644947f86b8d03832f3f3e507616301f not found: ID does not exist"
Mar 07 04:57:19 crc kubenswrapper[4689]: I0307 04:57:19.837948 4689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2fb24cf-51fa-412f-ba0d-76117c9e2949" path="/var/lib/kubelet/pods/d2fb24cf-51fa-412f-ba0d-76117c9e2949/volumes"
Mar 07 04:57:24 crc kubenswrapper[4689]: E0307 04:57:24.932391 4689 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Mar 07 04:57:24 crc kubenswrapper[4689]: E0307 04:57:24.933389 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:59:26.933349828 +0000 UTC m=+2411.979733357 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config-secret") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : secret "openstack-config-secret" not found
Mar 07 04:57:24 crc kubenswrapper[4689]: E0307 04:57:24.932511 4689 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found
Mar 07 04:57:24 crc kubenswrapper[4689]: E0307 04:57:24.933502 4689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config podName:3c25a937-0d93-4077-92d7-fbeac4f6abb3 nodeName:}" failed. No retries permitted until 2026-03-07 04:59:26.933474931 +0000 UTC m=+2411.979858460 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/3c25a937-0d93-4077-92d7-fbeac4f6abb3-openstack-config") pod "openstackclient" (UID: "3c25a937-0d93-4077-92d7-fbeac4f6abb3") : configmap "openstack-config" not found
Mar 07 04:57:25 crc kubenswrapper[4689]: I0307 04:57:25.832127 4689 scope.go:117] "RemoveContainer" containerID="a5e920e422f290f0adaacaa8069f13f665148bd13172616c95c274c51f3d032d"
Mar 07 04:57:25 crc kubenswrapper[4689]: E0307 04:57:25.832597 4689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dss5c_openshift-machine-config-operator(e6e9469a-474b-45c6-b3bd-638cb7a2e226)\"" pod="openshift-machine-config-operator/machine-config-daemon-dss5c" podUID="e6e9469a-474b-45c6-b3bd-638cb7a2e226"